A strange smell enters your nose. At first it seems pleasant, but at some point it hits you that the smell doesn't normally belong there. You then recognize that it's most likely an air freshener, and your brain supplies the needed context; namely, that there was a stink here. This knowledge makes the air freshener somewhat less effective, doesn't it? Moreover, I think that air fresheners are mostly ineffective, and I will gladly explain why.
All air fresheners smell pleasant, and yet most of the time when you smell them, the scent actually comes off as annoying, or at least less-than-pleasant. Why is this the case? Doesn't a pleasant smell always smell pleasant? In fact, no. The connotation that comes with the smell indicates that something else likely happened to make the air freshener necessary.
Moreover, when would you choose an air freshener over fresh air? Only on a summer's day when it's too hot to open the windows, or a winter's day when it's too cold to do the same. Depending on where you live, these may vary in frequency. Either way, there will be some days where the air freshener just smells out of place, even if it wasn't there to cover up a stench.
In many cases the air freshener cannot even completely cover the previous odor. This is arguably worse than not using any air freshener at all. The rank smell mingles with the more pleasant aroma, creating a confusing and irritating environment. So either the air freshener couldn't fully eliminate the other scent (and was therefore pointless), or the other scent was already gone, and the air freshener only serves as an ironic reminder of the incident (again, pointless). It's ironic because the pleasant smell hints at something unpleasant. In either case, we see that the use of the air freshener, at least as a remedy, was completely futile.
There is also something to be said about the oils. If given a choice, I'd prefer fresh air over freshened air any time. Most air fresheners use oils which either drift down to the floor or float around. Either kind introduces particles into the environment (albeit pleasant-smelling ones) that would not have been there otherwise - at least, certainly not in the quantity dispersed. This may even have some negative health effects, but I've not researched the subject. I simply know that when I inhale directly from a thick fog of such particles, the effect is not exactly stimulating, but rather cough- and sneeze-inducing.
Don't get me wrong; I'm all for the use of air fresheners. However, I don't kid myself into thinking that they are effective. At the very least, all they do is kick your olfactory system into gear and remind you that "there was a stink here."
This blog is basically food for thought. Possible topics include, but are not limited to, language, advanced mathematics, combinatory logic, games, philosophy, computers, poetry, music, and finally, science. Naturally, food is also a common undercurrent (as are puns and other overtures at humor ranging from mild to insane to groaners). Updates on Tuesdays.
Tuesday, July 10, 2012
Music For Nerds
In a similar vein as the "For Dummies" series, this post resides in the highly underdeveloped genre known as "For Nerds." The topic today is music! I am no expert at music, but I have my fair share of qualifications, particularly for someone who never majored in it or otherwise studied it intensely. I have played trumpet for over 10 years, piano for about 2 years, and dabbled in guitar until my cheapo First Act guitar broke (which wasn't very long). I also compose music, primarily solo piano and electronic. For those who are curious, I do my mixes in FL Studio Producer Edition (FL = FruityLoops), and my piano pieces in Finale. This was all self-taught during high school and college, but I've done little in that area since my career in software development officially began. As far as music theory goes, for seven of my eight college semesters, my roommate was a music education major. I absorbed no small amount of musical knowledge this way, second-hand.
Added to that, my own studies (going through all the lessons on musictheory.net in one night) provided a somewhat unstable basis for learning the actual theory. I compose primarily by ear, which may not be the best way, but it works for me. All that aside, I'd like to share some simple tricks for just about anyone. If you're a nerd like me, you will hopefully find these tricks a bit easier to remember than the old-school methods. The three tricks are how to quickly read a key signature, how to quickly read a time signature, and how to quickly read bass clef. A fourth tip is the simplest explanation I could muster for the idea of concert pitch, something that confused me to no end until recently.
Overview
Here is an overview of how to read music, what all the clef symbols and notes mean, and some of the other basic symbols on the notes.
Pitch and Clefs
Pitch is determined by the "height" of the note on the staff, together with the staff's clef and its position relative to other staves. The higher the dot of the note, the higher the pitch. Don't look at the stems, because they can point either up or down, which may be confusing (the stem is the line emerging from the dot).
For instance, on a piano score, you will see the treble clef, connected by a long vertical bar on the left side to the bass clef. The notes at the top of the treble clef are highest, and the notes at the bottom of the bass clef are lowest. The treble clef looks like a cursive letter S with a big dot at one end, but the important part is that the curl in the middle ends on the note line for G. The bass clef looks like a backwards C with two small dots to the upper right. The main shape also curls into a larger dot in the middle, a dot which would be the note F. Thus, these two clefs are often nicknamed G-clef and F-clef. Looking at their dots is one quick way to identify those particular notes.
The remaining clef is actually even easier to read, because its position on the staff tells you where middle C is. It looks like a double thick bar on the left, with a curly bracket thing on the right (the entire thing looks like the letter B). The bracket has a sharp point going left, and this sharp point will always be on a line. That line is always middle C. Thus, this is the C-clef, and it is used mostly for vocal parts, except where hymns are concerned, since hymns normally have the same four voices (soprano, alto, tenor, and bass) and are written all on one page, using only the treble and bass clefs. When a single vocal part (say, for opera) is written on its own page, the C clef is normally used to indicate which voice part it is.
Rhythm
The rhythm is largely determined by the type of dot in each note. The single 'beat' - the most basic unit in music - is normally the quarter-note, shown as a filled dot. Half notes and whole notes are unfilled dots. A half note is the one with the stem, and it gets two beats, while the whole note has no stem, and gets four beats. In common time (4/4), there are four beats in each measure; more generally, a 4 on the bottom of the time signature means the quarter note is the beat. The small vertical lines in the staff divide the measures for easy reading.
If a note's stem has a tail on it, then it is an eighth note (if there is a tail, the dot will always be filled). A double tail indicates an even shorter note, the sixteenth. A triple tail indicates a thirty-second note, and so on. Generally, the number used tells you what fraction of a measure the note represents, at least for common time. Thus, a thirty-second note is so fast, you could fit 32 of them into a single measure!
If two notes with a tail (eighths or shorter) are next to each other, the tails are connected, forming bars instead. Similarly, if four such notes are found together, all four tails are connected into one long bar. If the tails were double tails (for sixteenth notes) then there will be double bars instead.
A note with a smaller dot just to its right is a tricky one. You can think of this type of note or rhythm as working overtime - it gets time and a half! A dotted quarter note gets the value of a quarter note, plus half of that same value. Half of a quarter is an eighth, so a dotted quarter gets a quarter note's beat plus another eighth note's beat. It makes a lot more sense when you hear it, but the simplest way to think of it is as two separate notes tied together - it sounds exactly the same. (A tie is the curvy line connecting two adjacent notes of the same pitch.) Ties are normally used to connect notes across measures; within a single measure, it's usually cleaner to write one combined note instead.
A slur is just a tie between notes of different pitches. This indicates that the second note isn't as distinct as the first, giving a blurred or slurred sound between them.
Finally, the accidentals - the things that make music look the most confusing! They are certainly the trickiest part since they occur when there is a modulation (when the notes or key at that spot don't match the overall key of the song). There are only three different symbols - one looking like a lowercase letter B, one that is the pound sign or hash (#) and a third that looks like a box with lines extending from two of its corners (almost like the hash with six of its 'arms' cut off). The first is the flat, causing the note to its right to go down one half-step. The second is the sharp, which does the opposite - the note it modifies goes up a half-step. Finally, the most strange looking one is the natural - it means the note is neither sharp nor flat, and assumes its normal pitch for that location in the staff. There are also double flats and double sharps, but I think that's getting a bit advanced for this post.
Quick Read Tips
Time Signature
Now, here is the quick-read tip for the time signature. In case you can't find the time signature, it is usually on the very left side of each staff, just to the right of the clef symbol. It is two small numbers, one on top of the other, both wedged inside the lines of the staff. Sometimes (if the time signature changes during the song) you may also see a time signature just before a double bar.
The top number tells you the length of each measure. If it says 4, there are four beats between each of the small vertical lines dividing the measures. If it says 8, there are eight beats instead. However, the beat is defined by the bottom number, which tells you what note is considered a single beat.
I still haven't gotten to the actual trick yet. I find that examples work best - let's say you want to understand how 3/4 time relates to 6/8 time. The two are actually very similar!
Think of it in terms of math (because that's what musical rhythm boils down to). Reduce fractions, and 6/8 actually equals 3/4. What does that tell you? The ratio between the measure length and the beat is the same! And what does that tell you? Well, for one, you can conduct 6/8 in much the same way as 3/4, waving your arms in a triangle-like shape. Second, you can tell that 6/8 is 3/4 times 2/2 which means everything is doubled. Where in 3/4 you only have three quarter-notes per measure, in 6/8 you have six eighth notes. Hey, that matches up - 3/4 = three quarter-notes, 6/8 = six eighth-notes. BAM! There's your trick. 2/4 equals two quarters (two quarter notes per measure). 9/4 equals nine quarter-notes.
Time signatures ARE fractions. Treating them like anything else only makes things confusing. Treat them like fractions, and everything will make sense from now on. But wait, that's math, not music! Oh, right - for nerds math is cool, so this actually works out better! Win.
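If you want to see that literally, here's a minimal Python sketch (my own illustration, not anything official) - Python's built-in fractions module does the reducing for us:

```python
from fractions import Fraction

# Treat a time signature as the fraction top/bottom.
three_four = Fraction(3, 4)   # three quarter-notes per measure
six_eight = Fraction(6, 8)    # six eighth-notes per measure

print(three_four == six_eight)   # True - 6/8 reduces to 3/4
print(six_eight)                 # 3/4 - Python reduces it for you
print(Fraction(9, 4))            # 9/4 - nine quarter-notes per measure
```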
Key Signature
Now, key signatures are a little more complicated, as you may know. Since there are twelve half-steps (distinct notes, including sharps and flats) in an octave, the combinations of which sharps and flats you have can become quite complex. One thing is on your side - except for accidentals, you will always have either all sharps or all flats. There is no key signature with both sharps and flats. The only "odd man out" is the key signature with neither sharps nor flats.
Another thing that adds to the complexity is that each key signature can be either major or minor, and it may not be obvious at a glance which one is being used. In addition to that, there are flat and sharp versions of each major and minor scale. This can all be tabulated and calculated quite easily by counting the number of sharp or flat symbols in the key signature, and then using the following chart:
0 2 4 -1 1 3 5
At first, these numbers probably seem like they could serve no useful purpose. The fact is, they already represent the seven natural keys! And the simplest part is they begin with C and proceed in alphabetical order. In music, nothing goes past G, so starting from zero you have:
C = 0
D = 2
E = 4
F = -1
G = 1
A = 3
B = 5
Now, what do the numbers mean? It's quite simple: positive numbers represent sharps, and negative numbers represent flats, while zero means no sharps or flats.
For example, if you see a key signature with two sharps, it's either D major, or the relative minor for that same key (which also has two sharps).
Determining the relative minor is just as easy! Add three. D is 2, so D major's relative minor is 2 plus 3 or 5, which is B. So the key signature with 2 sharps is either D major or B minor.
What about sharp and flat scales? G sharp (G#) major and A flat (Ab) major and so forth? Amazingly, it gets easier yet! Simply subtract or add seven. Since flats are negative, you subtract seven to get the flat version of the key, and you add seven to get the sharp one. Ab major is 3 minus 7, which is -4, so that key signature has four flats.
You can also calculate minor keys directly by subtracting three. For instance, what would C minor be? Take the value for C (zero) and subtract three. You get -3, which indicates that C minor has three flats.
Time for another example: suppose you want to look at G# major. Here, an exception occurs. We end up with 1 + 7 or 8 sharps. However, the key of G# is never used; it is purely theoretical. If a composer were to want that exact key, he or she would use the enharmonic equivalent key of Ab major instead, rather than using a double sharp and six sharps to indicate the proper eight-sharp key of G#.
In some cases you may have to add or subtract twice. For instance, what about A# minor? First, you add seven to the value of A (3) to give you A# major, which is 10. Then, you subtract three for the minor, which gives you 7. Thus, A# minor has seven sharps.
One last example: if you see three flats, well this isn't on the chart, is it? But you know one possibility - add three and you end up with zero, which is C. So the relative minor is C minor, which could be the key. Another way to get -3 is subtract 7 from 4, which is how you calculate Eb major. The key is thus either Eb major or C minor.
Credit for this system goes to musictheory.net, though it doesn't mention the detail that you can add three to get the relative minor.
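For fellow programmers, here's the whole system as a small Python sketch. The chart, the add-three rule, and the plus-or-minus-seven rule are straight from above; the function names and structure are just my own illustration:

```python
# The natural-key chart from above: C D E F G A B -> 0 2 4 -1 1 3 5
NATURAL_KEYS = {"C": 0, "D": 2, "E": 4, "F": -1, "G": 1, "A": 3, "B": 5}

def key_signature(letter, accidental="", mode="major"):
    """Return the number of sharps (positive) or flats (negative).

    accidental: "" (natural), "#" (sharp), or "b" (flat)
    mode: "major" or "minor"
    """
    value = NATURAL_KEYS[letter.upper()]
    if accidental == "#":
        value += 7          # sharp version of the key: add seven
    elif accidental == "b":
        value -= 7          # flat version of the key: subtract seven
    if mode == "minor":
        value -= 3          # minor key: subtract three
    return value

def relative_minor(letter):
    """Letter name of the relative minor: add three and look it up."""
    target = NATURAL_KEYS[letter.upper()] + 3
    for name, value in NATURAL_KEYS.items():
        if value == target:
            return name
    return None  # falls outside the natural-key chart

print(key_signature("D"))                 # 2  -> D major has two sharps
print(relative_minor("D"))                # B  -> so two sharps is D major or B minor
print(key_signature("C", mode="minor"))   # -3 -> C minor has three flats
print(key_signature("A", "#", "minor"))   # 7  -> A# minor has seven sharps
print(key_signature("E", "b"))            # -3 -> Eb major has three flats
```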
Staves and Clefs
Lastly, understanding where all the notes are, independent of the clefs and staves, will greatly aid your ability to read different clefs. This section focuses only on the treble and bass clefs.
I'll assume that if you do know how to read music, you probably grew up playing a musical instrument. Unless you picked the baritone, tuba, or trombone, you are probably already familiar with the treble clef, and completely unfamiliar with the bass clef. Either way, the following should help you read the other clef a bit easier.
First, note that middle C is exactly that - the note in the middle, as far as the treble and bass clefs are concerned. Treble-pitched instrument players will know that middle C resides on the first ledger line below the treble clef staff. Bass-pitched instrument players will know that middle C resides on the first ledger line above the bass clef staff. I bet both types were surprised to learn this about the other clef! In fact, let's say you situated the clefs together so that the lines match up (so the treble clef lines blend perfectly into the bass clef lines as you move downward, and vice versa). There would in this case be exactly one ledger line between the two staves, with only one space on either side of it. Notes on this line would of course be middle C! How much simpler could it get?
One major helpful thing to note is that, for most notes, line notes in the treble clef become space notes in the bass clef, and vice versa. So the note G in the treble clef, a line note on the second-lowest line, becomes a space note in the bass clef, in the topmost space. Don't think about this too hard, as there are cases where it fails, but it's another handy thing to remember.
The major tip I can give you here is that the space notes (going from bottom to top) in the treble clef spell out the word FACE, while in the bass clef they are A, C, E, G, remembered with the mnemonic "All Cows Eat Grass." If you can just remember these two mnemonics (FACE = treble, All Cows Eat Grass = bass), then you can use the space notes to go up and down and determine the other notes from there.
Tuning Systems
Concert Pitch
Concert pitch is the system in which most music is written today. Let's say I have a trumpet tuned to B flat and a cornet tuned to C. These tunings are also in concert pitch. The primary goal of this system is to prevent the difficulties that arise when the same note can have different fingerings depending on the key. With concert pitch, as long as you know the fingerings for each note, you don't have to worry what key you're in when reading the music (other than pressing all the right fingerings for that key and any accidentals).
If I am playing the same piece of music written in concert C, it will sound different depending on whether I play it on the trumpet or cornet. If I play it on the trumpet, I am actually playing in concert B flat. I wouldn't hear the difference if playing by myself, but if I played along with someone else who was playing a C cornet, it would sound quite awful indeed. So why doesn't a concert full of instruments all pitched differently sound terrible?
Relative Pitch
The trick here is all in the writing. Solo pieces can be written in relative pitch, meaning your only limitation is to write within that instrument's range. What you hear may not always be the right key, but it will sound good by itself because you have no other parts behind it for a reference. It is often written out of key (transposed) to put more of the notes within the staff and make it easier to read, but without a tuning fork or a highly trained ear, you wouldn't know that it's out of key. With no other instruments playing (no reference), all the pitches would still sound correct, relative to each other. Being out of key only makes the overall pitch of the song higher or lower than normal.
Transposing to Concert Pitch
By contrast, for ensembles and concert pieces, each part is normally written transposed to the right pitch for each instrument, using the concert pitch system. Thus, B flat trumpet parts are written a whole step (two half-steps) above concert pitch. If the concert key is C, for instance, the trumpet part is written in D. Now, by playing these transposed notes, a B flat tuned instrument sounds in concert C. Likewise, an F-tuned instrument's part is written a perfect fifth (seven half-steps) above concert pitch, so for a song in concert C the horn part is written in G. If the concert key is B flat, the trumpet part ends up written in C, but each note is still a whole step above its concert pitch. A notation such as "horn in F" indicates that the part is written for an F-tuned instrument. Incidentally, F is also the standard tuning for horns, as is B flat for clarinets, and so forth.
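If it helps to see the arithmetic, here's a rough Python sketch of the idea. The semitone offsets for B flat and F instruments are the conventional ones described above; everything else (the names, and the shortcut of ignoring octaves) is my own simplification:

```python
# Half-steps each instrument's part is WRITTEN above concert pitch
# (the usual conventions for B-flat and F instruments; C means no transposition).
WRITTEN_ABOVE_CONCERT = {"C": 0, "Bb": 2, "F": 7}

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def written_note(concert_note, instrument_key):
    """Return the note name a transposed part shows for a given concert pitch.
    Octaves are ignored here - this only tracks the twelve note names."""
    index = NOTE_NAMES.index(concert_note)
    offset = WRITTEN_ABOVE_CONCERT[instrument_key]
    return NOTE_NAMES[(index + offset) % 12]

print(written_note("C", "Bb"))  # D - a B-flat trumpet reads D to sound concert C
print(written_note("C", "F"))   # G - an F horn reads G to sound concert C
print(written_note("C", "C"))   # C - a C instrument reads exactly what it sounds
```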
Transposing Instruments
Some instruments, on the other hand, have such high or low ranges that their music always needs to be transposed simply to avoid having the notes so far away from the staff, requiring many ledger lines. Such instruments can be kept in the same key, but transposed up or down by one or more octaves. Examples of instruments that must always be transposed due to their range are the piccolo and the contrabassoon.
Besides extreme ranges, there are other reasons why instruments might be written transposed from concert pitch, but all such instruments written this way are known as transposing instruments. For these, concert pitch is not used as the standard. Instruments that use concert pitch are known as non-transposing instruments.
Transposed Reading
There are certain skilled individuals who can read and play music outside of concert pitch. This means they have to use a different fingering depending on which instrument the part was written for, in order to match the correct concert pitch. If I have only a part labeled "trumpet in B flat" but I'm playing an instrument tuned to C concert, for instance, I would have to do the transposition myself as I am reading the music, playing each note a whole step lower than what is written. It would be far easier, however, just to obtain a "trumpet in C" part for that song!
Tuesday, July 3, 2012
Will Program For Twenty Cents
The subject of this post is actually programming, and another word after it which makes the post title a geeky pun. Bonus points for those who can figure it out! (Hint: You won't find the word anywhere in this post. I took extra effort not to use it).
From the first time I looked at computer code, I was fascinated by it. Ironically, during elementary school, I didn't think I'd ever be smart enough to program computers. This may have been because I wasn't smart enough at that point - the only intellectual limit of childhood seems to be one's inability to think farther than a week ahead. Better understanding your own potential for growth is part of maturing into an adult, I have found.
When personal computers first became available, their use was not widespread. Rather, the internet was simply a network of the major computers that existed at that time, and primarily between research facilities. Programming at that time would have been a headache compared to what it is today. Since binary (or more generally, anything digital) is nearly the opposite of how the human brain works (everything is analog), computer instructions then were written in hexadecimal, a number system on the other end of the scale from binary. Binary uses two digits; hexadecimal uses sixteen. In comparison, the number system we write with uses ten digits, and is simply called decimal.
Even in hexadecimal, computer code is all just numbers, but in a form more easily usable by humans. This is because, due to having eight times more digits than binary, a very long number in binary becomes a very short number in hexadecimal, meaning a lot more code can be shown with very few digits. This was the first form of software programming; hardware programming previously used switches that had to be set by hand.
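If you'd like to see just how much shorter hexadecimal is, here's a quick Python illustration (the value is arbitrary, just for show):

```python
value = 3000  # an arbitrary number standing in for an instruction or address

print(bin(value))       # 0b101110111000 - twelve binary digits
print(hex(value))       # 0xbb8 - only three hexadecimal digits
print(int("bb8", 16))   # 3000 - and back to decimal again
```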
Next came spaghetti code. The mental picture is fairly accurate - instructions just thrown in wherever they were deemed necessary. There was no real structure or organization at all. Each instruction was directly mapped to an address; the address was actually part of the programming code. The addresses had to be in order, but you could skip ones you didn't need, or decided not to use. You can see how it got its name with this kind of ad-hoc arrangement! However, the one improvement was that actual words could be used instead of codes. This introduced the need for another program, called a compiler, to come along after you write the code and turn the words into the hexadecimal and/or binary instructions that the computer can execute.
The first real structure came with the invention of - you guessed it - 'structured' code. The new idea here was to cut up the spaghetti into logical segments. Each segment, also known as a 'routine' or 'subroutine', was given a name, usually one that described what that portion of the instructions did. For instance, you might have a routine to display text on the screen, and another one to ask the user for input. In this way, instructions were organized by function, rather than being all thrown together in one big monolithic mess. In addition, this introduced the idea of parameters. In the case of a routine that displayed text on the screen, a parameter could be the text to be displayed. Whenever you invoke or call the routine (so that it performs the instructions contained therein) these parameters are given. This way, you need not reinvent the wheel and write the same code over and over each time you want to display text on the screen. You just call the routine that does it for you, and pass the text you want displayed as a parameter. This is also known as procedural programming, because it is organized by procedure (routines are also known as functions or procedures).
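As a concrete (if toy) illustration of routines and parameters, here's a small Python sketch - the routine names are mine, purely for example:

```python
def display_text(text):
    """A reusable routine; the text to show is passed in as a parameter."""
    print(text)

def ask_for_input(prompt):
    """Another routine: ask the user a question and hand back the answer."""
    return input(prompt)

# Call (invoke) the routines instead of rewriting their instructions each time.
display_text("Welcome!")
name = ask_for_input("What is your name? ")
display_text("Hello, " + name)
```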
This phase lasted quite a while until the next revolution: object-oriented programming. This provided not only further structure, but also several important, new concepts that would change the way programming was thought of, and what it was capable of doing. These powerful new tools created quite a stir and made computer code far more elegant and interesting. The three primary concepts are: encapsulation, inheritance, and polymorphism. All three fall under the umbrella term "abstraction" since they all give us new ways to represent abstract ideas (or objects) such as bank accounts, transactions, and other such things managed by computers, using computer code. This means the code is structured in a way that more accurately represents these objects, and therefore, more accurately handles and manages them.
Encapsulation is the idea of the black box. Think of a car engine, for instance. Many people haven't the foggiest notion of how a combustion engine works (perhaps a better example is how an airplane stays up in the air, since even fewer seem to understand that secret). However, that isn't a very big problem, unless of course, your car breaks down (or the airplane crashes). As you drive the car, it doesn't (and shouldn't) concern you what happens when you press the gas pedal, or the brakes. When you turn the steering wheel, you don't care how the engine responds and decides which way to turn the car. It doesn't matter to you, because it's a black box system. As long as it does its job, the black box can remain a mystery, and there is absolutely no problem with that.
We can do precisely this same thing with computer software. We can now write a portion of code that can interact in a well-defined way (known as an API) with other code. We can bundle up the code we wrote, sell it to someone else, and then they can write code on top of it that turns the steering wheel and pushes the pedals, so to speak. They don't care how our code works; it just works. When they turn the steering wheel, the car turns, and when they push the gas pedal, the car moves forward.
Encapsulation is accomplished in the software world by defining the scope of program elements. The scope tells us where in the program (and outside) we can see those elements. Functions, as mentioned earlier, are one such element. Stored data is the other primary element. We can define something as public (viewable by everyone, everywhere) or private (viewable only within the black box). This allows us to share the information we want, and protect the information that shouldn't be shared within the black box.
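Here's a rough sketch of encapsulation in Python. One caveat of my own: Python marks members private only by convention (a leading underscore) rather than enforcing it the way some languages do, but the black-box idea is the same:

```python
class BankAccount:
    """A black box: callers use deposit/withdraw, never the balance directly."""

    def __init__(self):
        self._balance = 0  # leading underscore: private by convention in Python

    def deposit(self, amount):          # public: one of the black box's controls
        if amount > 0:
            self._balance += amount

    def withdraw(self, amount):         # public
        if 0 < amount <= self._balance:
            self._balance -= amount
            return amount
        return 0

    def balance(self):                  # public, read-only view of private data
        return self._balance

account = BankAccount()
account.deposit(100)
account.withdraw(30)
print(account.balance())  # 70 - we never touched _balance from outside the box
```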
Inheritance is a lot simpler; you are already familiar with one form of it - genetics. In programming, inheritance works exactly the same way. We can write generic code that acts like an Animal - it has behaviors (defined by functions) such as speak, play, sleep, and so on. Then, we can write more specific code that acts like a Dog, but inherits the more generic aspects that it shares with all Animals. All animals can speak, but when a Dog speaks, the behavior can be defined specifically as "Bark." We could then write a Cat which inherits this same behavior (speaking) but again, when we invoke the Cat's 'speak' function, instead we receive a "Meow" in response.
Finally, polymorphism is the most complex of the three. It's quite a difficult concept to wrap your mind around, even if you're a programmer. However, the simplest way to explain it is the example from the last paragraph, since it is closely related to inheritance. When a Cat speaks and we hear a "Meow," then a Dog speaks and we hear a "Bark," that is an example of polymorphism. In either case, we are simply invoking the inherited "speak" function - but the behavior is different depending on the subclass (Cat or Dog). This is polymorphism - the ability to define a specific response to a generic behavior.
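Here's the Animal/Dog/Cat example from the last two paragraphs as a small Python sketch (a minimal illustration, not any particular library):

```python
class Animal:
    """Generic code: every Animal can speak and sleep."""

    def speak(self):
        return "..."    # a generic animal has no particular sound

    def sleep(self):
        return "Zzz"


class Dog(Animal):      # Dog inherits everything an Animal can do...
    def speak(self):    # ...but defines its own specific response
        return "Bark"


class Cat(Animal):
    def speak(self):
        return "Meow"


# Polymorphism: the same generic call produces a different specific response.
for animal in [Dog(), Cat()]:
    print(animal.speak())   # Bark, then Meow

print(Dog().sleep())        # Zzz - inherited, unchanged, from Animal
```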
In essence, these abstractions give us two dimensions in which to program. With structured design, a function always does the same thing every time you call it. With object-oriented design, polymorphism gives us a different response based on both the function/behavior and the object/idea. Invoking the same function on a different object normally produces different results.
Now, prepare your mind for some extreme warping - we are now in the age of subject-oriented programming, where we can wield three such dimensions. The result or response we get from a function can now be defined by the name of the function, the object on which it is invoked, and the subject related to the invocation. For instance, my Dog might have a different bark depending on whether he was happy to see me, or whether he was trying to fend off an intruder. This constitutes the subject, or aspect, in which the behavior is invoked.
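Real subject-oriented (and aspect-oriented) programming relies on dedicated language or tool support, so take this only as a loose Python sketch of the idea: the response now depends on the function, the object, and a third dimension - the context of the interaction:

```python
class Dog:
    # The response depends on the function (speak), the object (this dog),
    # AND the subject/context of the interaction - a third dimension.
    def speak(self, context):
        if context == "greeting its owner":
            return "Happy bark!"
        if context == "fending off an intruder":
            return "Fierce bark!"
        return "Bark"

rex = Dog()
print(rex.speak("greeting its owner"))        # Happy bark!
print(rex.speak("fending off an intruder"))   # Fierce bark!
```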
Aspect-oriented programming is very similar to subject-oriented, but to spare your mind further warpage, I won't go into any detail on the differences between the two. Instead, I will just say that programming has come a long way from using hexadecimal codes and command line interfaces. We now have the power to determine the software's behavior based on the desired function, the object performing the action, and the context in which the interaction occurs. This gives incredible potential even just for artificial intelligence. Computer code that can update and edit itself is now just around the corner.
And yet, DNA has been doing exactly that for thousands of years. Is that something that occurred by random chance? I think it's about as likely as computers assembling and programming themselves out of thin air. It takes a mind more intelligent than a computer to design and build a computer. By the same token, it takes a mind more intelligent than a human mind to create such an incomprehensibly vast universe, and to populate it with beings whose bodies themselves contain technology far more advanced than computers; least of all, the human mind itself.
Tuesday, June 26, 2012
Piracy Is Wrong, Period
The following was actually meant for a previous post (the one on being regarded as a computer genius) - however, I rambled so much in my initial write-up that I simply couldn't leave this all in the same post. It had become far too big, and a post all its own. As you may know, that other post was already long enough! This one is about media sharing and piracy. Touchy topic, I know. There are also some smaller bits about CD visors and generosity.
To get things rolling, I would like to point out that music, movies, and, in general, things you think shouldn't be free, are never actually free. Sure, you can get them for free online - illegally. If your local computer genius promises you a free copy of Photoshop, for instance, be very skeptical and ensure that they are not doing something that is against the law. I don't care what edition it is; enterprise, professional, teacher, student, or hobo - if it's free, there is a 90% chance of something illegal going on in obtaining it. I'm a computer genius - I should know, right?
Adding insult to injury, most of the sites claiming to offer such "free" goods are among the worst sites to visit, purely from a safety standpoint. They are often bloated with malware, and links to gambling and porn sites galore. It's simply a nightmare, and it's not made any better by the fact that the people who, often unknowingly, access these sites have no virus protection on their computer. They end up getting something for nothing, certainly - a free computer meltdown.
Just because the RIAA, or the standards bureau for whatever industry is involved, can't prosecute 99% of the cases of media piracy (or unauthorized reproduction and distribution, whatever you want to call it) that occur online is no reason to condone it. I certainly don't, and you shouldn't, either. Don't claim that the distribution system is flawed - come up with one that isn't. Don't claim ignorance - I've now rid you of that excuse. Just pay up.
Getting something for nothing is a ridiculous idea that is taught by a society addicted to gambling, massive debt (or rather, living beyond one's means) and other such vices. Don't be taken in by the lie. Anything worth having is worth earning or buying. Getting it for free may feel great now, but it destroys your character. Free things are temporary and will only decay and wear out with use. Character is powerful and eternal. It's easy to see which matters most.
Burning copies of copyrighted CDs and DVDs is illegal too. That's not necessarily piracy - piracy in the strictest sense is copying and then selling something you don't have explicit permission to copy (much less sell). However, unauthorized copying and distributing is just as illegal as piracy. That, and the fact that "piracy" is a lot less of a mouthful than "unauthorized reproduction and distribution," leads to the popular belief that the two are the same thing. It doesn't really matter, though - one is just as illegal as the other.
Now, with CDs I admit there is certainly a valid reason why so much of this copying goes on, perhaps more so than in other markets. Notice, I didn't say there is a valid reason why it is legitimate. It isn't legitimate in any sense. However, many people seem to think it's okay because it helps generate interest, and that any "woe" over lost sales is ludicrous compared to the buzz factor gained. I won't get into that debate; but again, I must draw the line clear and simple, black and white: it's illegal. Does that word even mean anything these days? If it means something to you, or if you have any desire to consider yourself a decent citizen, don't do it.
Technically, copying (a.k.a. 'ripping') any copyrighted CD to your computer (yes, even one you paid for and own) counts as making an unauthorized copy; so does burning a new CD so you can have one to play in your car. I feel this is going a bit far, myself. If I paid money and legitimately own my own copy of the media, have no intention of making it available to anyone else, and have the means and the know-how to prevent that from ever happening*, I don't see what the big deal is. It's the same music, I own it, and it's up to me how I decide to use it, as long as I'm the only one doing so.
*All you really need to do to prevent this media from being taken without your consent is to always lock your car - and if you leave CDs in your car, they are likely to get fried anyway. I recently bought a CD visor for my car, and it has a warning label that says not to use it in any "closed cars" - I'm not joking! Am I mistaken, or would that refer to all non-convertibles? And if a convertible is not closed - that is, if the top is down - where would one put the visor?
However, though I do think that prohibiting personal CD burning is going a bit far, it's still the law. I would hardly be justified in this anti-piracy rant if I myself were guilty of it. In the past, I certainly was, no denying that - but it's something I have been working hard to set right. My current copy of Photoshop is legitimate, for instance - something I couldn't have claimed even two years ago. Do I think Photoshop is worth $700? Absolutely not! I settled for Photoshop CS2 on eBay and saved five hundred bucks. Oddly enough, I like my legitimate version better - and not just because my pirated copy was the much-older Photoshop 7 (though mostly for that reason).
The same can be said about the Producer Edition of FL Studio (Fruity Loops - a flagship music production program, for you non-nerds). Unlike Photoshop, I believe it was worth every penny, and that's why I paid for it. I have no interest in obtaining such a great piece of software, which no doubt cost many other software developers like myself untold hours of labor and effort, for free. That labor and effort is worth something to me - namely, precisely the amount I paid for it.
That's the very idea of a market. You render services and obtain goods, or you render goods and obtain services, or some other such exchange, where the value of the items exchanged is estimated to be about the same by both parties. You never render nothing and get something; the very idea is absurd. How many things on which you would place a high personal value have you given away for free in the last year? Case in point.
Giving for the sake of giving is the ONLY way anyone ever gets anything for free, and by the very nature of the idea, YOU are never on the receiving end. People give to the less fortunate because the less fortunate are just that - needy. Maybe they ask for help, maybe they don't. If you have a roof over your head, running water, air conditioning, and even mediocre health, you are more blessed than probably 90% of the world's population.
It's the problem of give versus get. Getting something for free is irrational, while giving something for free is quite rational, and the mark of any stable society - a virtue, quite rare these days, known as generosity. The modern idea of generosity seems limited to a certain time of year, and it is far less affordable that way.
Piracy is not wrong merely because making an unauthorized copy of a CD or DVD financially or even mentally hurts someone else, even indirectly. I ran across an article recently that seems to indicate the very opposite - that piracy actually helps music sales, of all things. No, piracy is wrong because the law says it's wrong. Piracy is wrong because the United States has a moral code that everyone is bound to uphold - namely, the Constitution.
Constitutional law used to be absolute, meaning there was no reasoning around it or changing it on a whim, or because “times have changed.” These days the term 'obsolete' has replaced 'absolute.' But back when the U.S. was founded, the law was the law, like it or not. If you don’t live by it, don’t be surprised when no one else does, either. Know what that’s called? Anarchy. Set an example and start developing some character. Rid your life of the plague of piracy and the lie that you can get something for nothing. Start giving something for nothing and see where that gets you – see where that gets our society.
Labels: CD, copying, copyright, DVD, free download, illegal, movies, music, piracy, unauthorized
Tuesday, June 19, 2012
Entertaining Ideas
No, this post is not about ideas that one might find entertaining. Some of these ideas may be entertaining (by accident, mind you!), but that's not the point.
The point is this:
"It is the mark of an educated mind to entertain an idea without accepting it."
-Aristotle
I simply love this quote. People would get along much better on the whole if they were, by Aristotle's standard, educated - or at least acted in a way that bore this mark.
Why? It seems to me that the primary reason most arguments begin (other reasons notwithstanding) is that neither person can entertain (is willing to consider) the idea that the other person is trying to get across, or the idea that the other person might be correct. Another idea many who commonly get into arguments cannot seem to entertain is the idea that more than one person can be right. I admit personally that these are two very difficult ideas to entertain! You have to dance a jig and tell a joke, and even then it's a tough audience.
I have noticed (in hindsight) that in roughly 80% of the arguments I have been in, both participants were correct, and neither was willing to admit this possibility, resulting in only frustration and anger, and prolonging the argument.
This is an important concept to wrap your mind around (to entertain, if you will) and one that I believe even goes beyond the Aristotle quote above. Because it was stated so concisely and eloquently, I am including a portion here from an article in 2600 magazine (The Hacker Quarterly) that explains this very subject, in terms of the stigma and confusion surrounding the hacker group Anonymous:
"Because we have a culture where there are good guys and bad guys, we demand that those labels be used, and that people be lumped into either one or the other, preferably those who agree with us and those who don't. The problem is that when we do that without understanding why it doesn't actually work that way, we unfairly prosecute people who were doing the "right" thing, and wind up having to deal with people who have been mislabelled. [...] [Y]ou can't really destroy an idea unless you consider it. The problem is, once you open your mind and consider it, you may no longer disagree with it.
And that is the bottom line which creates and perpetuates both the fear and the paranoia [about Anonymous]: a sense that we might just be wrong. When you only ascribe to the "good" things with which you agree, you leave no place for learning from your mistakes. Thus, when we discover we have made mistakes, rather than being honest, meeting sympathetic eyes, and moving on, we must run and hide, begging forgiveness, or morph the mistakes into shell statements of what they actually were, devoid of any meaning, and shedding any potential lesson we could have learned. With this pattern, we learn to brush things we don't understand under the table, hoping they will go away and leave us alone." [1]
The drama about Anonymous aside, the author's point is a great one. When fully understood, it is quite profound, and opens your mind to a new way of thinking - a way of entertaining ideas, as Aristotle put it, without accepting them. If you can truly do this, you have not only proven yourself to have the mark of an educated mind - you've made a big step down the never-ending but enlightening road of education outside the classroom.
[1] aestetix, "Who Is Anonymous?" 2600 Magazine. Spring 2012.
Tuesday, June 12, 2012
On Being A "Computer Genius"
Whenever I meet anyone, the moment that person becomes aware that I might know a little something about computers, they come to regard me instantly and irrevocably as a "computer genius." I've always been bemused by this, and perhaps a little bit frustrated. I would therefore like to elaborate (and quite elaborately) on this subject, particularly for those who might think they know such a "computer genius" - and show you what is wrong with this concept. It's not that complicated, I promise. In fact, this diagram pretty well explains it, for the most part. (Slight warning: there is quite a lot of vulgar material on XKCD, so I can't recommend browsing into it very far. Do so at your own risk. There is none on the particular comic found at the link above, however.)
Personal Disclaimer
Before I get into the meat of this post, I will say that this is not written with any bitterness or anger towards people that regard me as a computer genius. I will happily devote my time to helping you solve your computer troubles; this goes for anyone. If you know me, you know this is true. I have no reservations about making technology work better for people, and any way I can serve others is a great benefit to me personally on many different levels. Again, this post is written only to educate and inform, not to spread discontent in any way. I would like nothing better than for you to continue regarding me as a computer genius (even after having read this post) if you are so inclined. It won't hurt my feelings one bit - though it might inflate my ego a tad, something I could definitely do without. I'll leave your reaction up to you. Just don't come away from this post more hesitant than before to ask me for computer help - that's not my purpose in writing this at all. My computer knowledge is almost entirely useless if I can't use it for the good of others.
Overview
First I will discuss the most common misconceptions about "computer geniuses," then I will move on to refute the false idea of computer illiteracy. Third, I'll explain why a true computer genius does not exist, and finally, I will show that even if computer geniuses did exist, neither I, nor most nerds, would qualify as one. Believe it or not, I'll do this all without using any technical terms or saying anything that might go over your head. Yes, I admit, that will take some effort (not using any technical terms). Well, okay, there might be one or two scattered around, but you won't need to understand them to get what I'm saying.
You may have noticed that this post is quite large. It is therefore broken into sub-sections to make both reading and browsing somewhat easier.
What Is A "Computer Genius"?
A computer genius, as defined by just about everyone who regards me as one, is a person who can call up any bit of knowledge whatsoever about a computer at any given moment. They can fix any computer problem, provide advice for any situation that even remotely involves a computer, and are always the best candidate to look at your computer and make it go faster and work better, in every way possible. They know all the latest facts about any and every computer model, operating system, hardware, and software, and can provide you near-free* access to any of it given a moment's notice, and a flash drive.
*Here I would like to note (I may write a future post just for this topic) that music, movies, and, in general, things you think shouldn't be free, are never actually free. Sure, you can get them for free online - illegally. If your local computer genius promises you a free copy of Photoshop, for instance, be very skeptical and ensure that they are not doing something that is against the law. I don't care what edition it is; enterprise, professional, teacher, student, or hobo - if it's free there is a 90% chance of something illegal going on in obtaining it. I'm a computer genius - I should know, right? (Though if there were any 'hobo editions' I imagine they'd be free!)
I admit that to someone who considers him or herself computer illiterate, we nerds may actually seem to possess all this knowledge regarding computers, and that's totally understandable. In fact, that's the myth this post is here to dispel. Not that you should stop regarding me/us as computer geniuses, but I'll be the first to say we certainly don't deserve to be put on a pedestal. In fact, most of the time when I have helped someone do just about anything on a computer, I don't really feel I've done anything extraordinary at all. Most of the computer things I end up helping others with are things I routinely do without the slightest notion that someone else might not know how to do them.
Guess What? You're Not Computer Illiterate
By the way, a 'computer illiterate person' is a figment of your imagination. I'm sorry, but that's just a lame excuse not to learn anything more about computers, or put forth any effort in doing so. As stated in my disclaimer above, I'm totally fine with bearing all your technological burdens. That being said, is it too much to ask that you put forth at least as much effort in learning about computers as you do with anything else?
Computer technology is really not all that difficult to learn, particularly in this age where you don't have to know the commands to run programs, for instance. A program is now just a little icon that you click on. It's been made as simple as possible so anyone can use it - the least you can do is have some self-esteem and stop calling yourself illiterate. If you can read, you can use a computer. Reading takes far more mental acuity than using a computer - though I admit that I learned to use a computer at an age remarkably close to the age at which I learned to read. If you can't say the same, I'll give you a little bit of philosophical slack there.
If you're over 50, you get a lot more slack due to growing up before the age of what we know as modern computer technology. This is because the human mind generally begins to lose some of its ability to quickly and readily absorb new information around the mid-twenties. This puts the pioneering age of personal computers, or at least commonplace usage of them, well past your years of optimal computer-learnage, to turn a phrase.
Still, the "you can learn anything" principle always applies. Ask anyone who you would consider an expert at anything - besides computers. Think of the one person you know who is best at a given skill or talent. Collectively, these people would tell you that when they began doing it, they were terrible at it and thought they would never get any better. Had the subject in question been computers, they might have even labelled themselves as computer illiterate. The key is that they didn't let that stop them for a second.
How To Approach "Computer Geniuses"
Calling yourself computer illiterate in front of a computer genius means basically nothing, for starters. Nearly everyone that person meets is by comparison computer illiterate, except perhaps people working in the computer industry. But think a little more deeply about what you are really trying to tell the person - wouldn't "I need help, can you teach me?" come across as a slightly more inviting attitude? When you've basically said "you'll have to do everything for me, and I won't even try to pay attention," an awkward smile just won't cut it (though smiles are generally never frowned upon, at least not in my book).
If you can't be sincere, at least act like you want to learn. I can't imagine that anyone would enjoy having to ask how to do every single tiny thing on a computer, particularly asking more than once about the same exact thing. So do you, on the flip side, imagine we like teaching these same basic things over and over, to people who refuse to put forth any effort in trying to learn?
Don't just make the motions - follow through. Bring a pencil and paper so you can write down instructions, if necessary. Don't force your local computer genius to write it down for you. Not only is their writing seldom self-explanatory (or even legible), but it would be like asking a scientist to help you do advanced science by writing down the instructions using basic formulas - ones they work with every day. Most scientists, if you were to do that, would simply hand you a copy of their calculus textbook. With computers, there is no such manual, or at least not one with universal formulas like mathematics has.
Anyone who has been through high school should be competent enough at math to at least understand an equation, if not solve one. Computers, on the other hand, are all about how to use the mouse, and how to interact with the buttons and other elements displayed on the screen. To a lesser extent, it also helps to have a basic understanding of files and how data is organized and handled by the computer. Nearly every computer-usage instruction would begin with "move the mouse" and include several "click once here" or "click twice there" phrases. To be blunt, using a computer is far easier than math. Now programming computers, on the other hand...
As an aside, did you know that the first mice (for mainframe computers) were so large that you had to sit down on them and drive around, like a golf cart with a tail?
Just kidding! But the very idea sure is amusing!
There Are No Computer Geniuses
I'm going to assume that by now, you've realized you might actually be able to learn how to use a computer, and begun to move out of your self-imposed "computer illiterate" shadow. Great work so far! You're roughly halfway to learning something about computers. The next step is to realize that, no matter how much computer knowledge is possessed by any person you know, there will always be something that person simply can't help you with, no matter how much they would like to. If that person happens to be me, I will happily utilize my well-developed Googling skills* and at least help you try to find an answer somewhere, even if I don't have it.
*Yes, Googling is actually a skill. It seems to me that if there's something you're interested in finding, it is most likely "out there somewhere" in the world wide web. If it really is out there, chances are that I can find it. I figure it's a skill because 99% of the time, I found it, and the person who asked me to find it did a considerable amount of searching for it themselves. Here's a hint: the key words are the key! Even changing a single word in your search can bring up drastically different search results. For instance, "iPad keyboard case" versus "iPad keyboard cover."
The point is, a true computer genius simply does not exist, because it's a stereotype. Computers are a vast collection of highly advanced digital circuitry that no single person in the entire world knows everything about. Just understanding how they work, from beginning to end, is a monumental goal in and of itself, which very few people, even in the computer world, have ever achieved. Computers (to be more precise, personal computers) represent decades and even centuries of continuous research and development. Would you really expect a single person to know, for instance, everything there is to know about any other field? Artistry? Music? Food? Architecture? Then how is it reasonable to impose this same stereotype on someone who, at best, actually happens to be the world's leading authority on some tiny portion of computer technology, and at worst, knows a bit more about computers than you do? Odds are it's the latter, and in most cases, you've barely even met this person!
The answer is, it's no more reasonable than any other stereotype - quite unreasonable, really. Expecting their help simply because they are the "local computer genius" - and because you haven't invested any time or energy into your own knowledge of computers - is even more unreasonable. When you want to know something about flowers, do you drive down to the local flower shop and pepper the cashier with questions you already know she can't answer, calling yourself "flower illiterate" - or do you head to Wal-Mart, buy a gardening magazine and some seeds, then head home and grab your hoe and shovel? Why should your approach be any different with computers?
I can't blame you if some hardware breaks and you want someone else to install it. Hardware is a somewhat different story, since that is far more specialized knowledge. However, most non-hardware issues can be solved by moving the mouse to the right spot and clicking. This involves no digging of holes or planting of flowers, and certainly nothing anywhere near as advanced as basic math or even reading and grammar - so why harbor the ridiculous stigma that "I will never be able to understand or use computers"? In 90% of computer problems, simply Googling your problem and then following the first set of instructions you find will get you halfway to solving it, and by then it wouldn't be rocket science to figure out the rest.
What We 'Computer Geniuses' DO Know
Not to sound like I'm working backwards, but now that I've covered all that - many computer geniuses may actually know quite a bit about computers! Some more than others. This shouldn't surprise you, but in light of everything I've said, it's still something to consider. It becomes pertinent, then, to understand more about the different categories of computer knowledge (again, as with any other field) so that you can begin to pinpoint the areas you need expertise in, and therefore, target those individuals who will actually be able to help you. Why do I say this? Odds are if you consider someone a computer genius, he or she is quite likely to try and help you regardless of your request, even despite a complete lack of knowledge in that area. It's true! I do it all the time.
Is it just to keep up the 'computer genius' image? Is it out of sheer helpfulness? I can't say for sure, but it would definitely make things less confusing if more people read this post! Even just guessing the area your problem falls under might save your computer genius the effort of figuring that part out. Even if you guess wrong, you will have impressed them just by trying to apply yourself. For instance, instead of saying "I'm computer illiterate," say something like "I think I'm having a network problem, can you help me?" By contrast, when I hear someone tell me "I'm computer illiterate" by implication they are expecting me to know everything and fix everything with little to no involvement from them.
Now, I certainly wouldn't offer to help anyone else if I didn't think I knew something about computers. In fact, I've studied computers quite a bit just for the sake of learning about them - what I know about computers is not all 'intuitive,' and it did not all enter my head the first time I laid hands on a keyboard and mouse. It doesn't always come easily, either. Still, one might wonder, why don't I consider myself a computer genius?
Let me answer this question in a somewhat odd fashion. I will attempt to categorize all computer knowledge. This should open your mind somewhat to just how vast is the world of personal computers. Then I will estimate my level of knowledge for each category.
In my experience, here are the basic areas of computer expertise, with a brief description of what exactly each one is:
Software - the available programs, or lists of instructions (computer code) that tell computers what to do.
Hardware - the physical components that make up a computer.
Networking - the wires, cables, airwaves, satellite transmissions, and other such things that computers use to communicate with each other, as well as the involved hardware, software, and communication protocols.
Digital Electronics - the super-tiny components that make up the hardware.
Programming - the art of writing, testing, and distributing software (computer instructions) that performs meaningful tasks.
Web Design - the art of creating websites.
Server Architecture - the computers that run websites and the internet.
Security - keeping data secure as it travels over the internet, and keeping computers from being infected with viruses (malicious programs).
Here's the kicker - this is so high-level we have barely scratched the surface. Each of these areas is in itself an entire field, about which no one person could possibly know everything. Are you starting to get the picture?
Now, I will be using a percentage scale: 100% representing all available and possible knowledge about any one particular category, and 0% representing absolutely no knowledge in that area. Here is a breakdown of my own estimation of my computer knowledge and skill:
Software: 36%
Hardware: 3%
Networking: 4%
Digital Electronics: 7%
Programming: 22% (I am a software developer, mind you!)
Web Design: 11%
Server Architecture: 1%
Security: 18%
I think that should pretty well answer the question. My extreme lack of computer knowledge speaks for itself! I would define a true 'computer genius' as someone who has even 25% to 50% in each of these categories.
Don't assume by these low numbers that I am trying to be modest. This is the most accurate data that I could come up with. Think about this for a moment: of all the programming knowledge, even with me being a programmer - I believe I know less than a fourth of all knowledge about programming. That's in just one category, and my second highest!
What should really help drive the point home is when you consider the average computer problem. Here is another breakdown estimating what percentage of all computer problems (adding to 100%) occur in each category:
Software: 22% (not bad programming, just installation issues)
Hardware: 24%
Networking: 14%
Digital Electronics: 1%
Programming: 31% (this is basically the cause of most software problems)
Web Design: 5%
Server Architecture: 1%
Security: 2%
Since the average computer problem can be solved with a slightly-above-average collection of knowledge about software, hardware, and programming - something any nerd often pegged as a computer genius is quite likely to have - this explains how we appear to have "all knowledge." Networking generally takes care of the rest; however, that ends up being the most common unsolved problem category, since very few nerds actually know enough about networking to solve most such problems. Problems that find their way to computer geniuses from outside sources typically range from simple to medium difficulty; however, as was just stated, the network category is a bit different. Since networking sits at a much lower level of the overall computer architecture, the configuration (and often the mathematics) involved is usually too difficult for your average computer genius, myself included. In my case, I have less of an excuse for this than you do for computer knowledge in general - I graduated from college with a double major in computer science and math. For me, the gap is simply a lack of detailed study of (or more precisely, interest in) the networking field.
'Computer Geniuses' Have Problems With Computers Too
Finally I would like to briefly refute the myth that computer geniuses never have computer problems of their own (since, obviously, they already know how to solve them all.) This is not only quite untrue, but the opposite is actually the case - computer geniuses, or those regarded as such, typically have far more problems, and more difficult ones at that, than people that consider themselves 'computer illiterate.' Why does this occur? It isn't just because computer geniuses can usually solve the easier problems on their own - it's primarily because they do more advanced things on a regular basis and change settings a lot more frequently. As such, they are far more likely to break something in these advanced settings.
As an example, one of my first experiences with a computer was in the Windows Registry. I won't go into any detail here about what that is - just know that it's an important part of Windows, the primary collection of software (called the OS or operating system) that gives you all those little icons, buttons, windows, scroll bars, and other nice things to click on. Well, I was messing around in there without understanding what this Registry was for or how it worked. I began changing some things and found that now my programs would not run. I really thought I'd broken that laptop for good! However, I played around with it some more and was luckily able to get it back to normal. I admit it did require some deductive skill and intuition about what was going on with this Registry program, but it still taught me what it does and why it is there. This is a problem 70% of computer users would never have, because 70% of computer users don't have the foggiest idea that the Windows Registry even exists, much less any desire to open it or start changing things.
Whew! This will likely be my longest post for quite a while - maybe for the entire blog. I guess I had a lot to say about this. It's certainly plenty of food for thought. I'll leave you with one major tip for solving nearly ANY computer problem. If all else fails...throw it out the window! Forget versions, OS levels, and actually having to know anything about computers to begin with. This solution works! And it gives you a great excuse to go buy a new computer. Although, to avoid any legal controversies and/or murder charges, you may want to ensure there is no one walking around on the sidewalk below said window before you go and chuck a computer out of it.
Personal Disclaimer
Before I get into the meat of this post, I will say that this is not written with any bitterness or anger towards people that regard me as a computer genius. I will happily devote my time to helping you solve your computer troubles; this goes for anyone. If you know me, you know this is true. I have no reservations about making technology work better for people, and any way I can serve others is a great benefit to me personally on many different levels. Again, this post is written only to educate and inform, not to spread discontent in any way. I would like nothing better than for you to continue regarding me as a computer genius (even after having read this post) if you are so inclined. It won't hurt my feelings one bit - though it might inflate my ego a tad, something I could definitely do without. I'll leave your reaction up to you. Just don't come away from this post more hesitant than before to ask me for computer help - that's not my purpose in writing this at all. My computer knowledge is almost entirely useless if I can't use it for the good of others.
Overview
First I will discuss the most common misconceptions about "computer geniuses," then I will move on to refute the false idea of computer illiteracy. Third, I'll explain why a true computer genius does not exist, and finally, I will show that even if computer geniuses did exist, neither I, nor most nerds, would qualify as one. Believe it or not, I'll do this all without using any technical terms or saying anything that might go over your head. Yes, I admit, that will take some effort (not using any technical terms). Well, okay, there might be one or two scattered around, but you won't need to understand them to get what I'm saying.
You may have noticed that this post is quite large. It is therefore broken into sub-sections to make both reading and browsing somewhat easier.
What Is A "Computer Genius"?
A computer genius, as defined by just about everyone who regards me as one, is a person who can call up any bit of knowledge whatsoever about a computer at any given moment. They can fix any computer problem, provide advice for any situation that even remotely involves a computer, and are always the best candidate to look at your computer and make it go faster and work better, in every way possible. They know all the latest facts about any and every computer model, operating system, hardware, and software, and can provide you near-free* access to any of it given a moment's notice, and a flash drive.
*Here I would like to note (I may write a future post just for this topic) that music, movies, and, in general, things you think shouldn't be free, are never actually free. Sure, you can get them for free online - illegally. If your local computer genius promises you a free copy of Photoshop, for instance, be very skeptical and ensure that they are not doing something that is against the law. I don't care what edition it is; enterprise, professional, teacher, student, or hobo - if it's free there is a 90% chance of something illegal going on in obtaining it. I'm a computer genius - I should know, right? (Though if there were any 'hobo editions' I imagine they'd be free!)
I admit that to someone who considers him or herself computer illiterate, us nerds may actually seem to possess all this knowledge and such in regards to computers, and that's totally understandable. In fact, that's the myth this post is here to dispel. Not that you should stop regarding me/us as computer geniuses, but I'll be the first to say we certainly don't deserve to be put on a pedestal. In fact, most of the time when I have helped someone do just about anything on a computer, I don't really feel I've done anything extraordinary at all. Most of the computer things I end up helping others with are things I routinely do without the slightest notion that someone else might not know how to do it.
Guess What? You're Not Computer Illiterate
By the way, a 'computer illiterate person' is a figment of your imagination. I'm sorry, but that's just a lame excuse not to learn anything more about computers, or put forth any effort in doing so. As stated in my disclaimer above, I'm totally fine with bearing all your technological burdens. That being said, is it too much to ask that you put forth at least as much effort in learning about computers as you do with anything else?
Computer technology is really not all that difficult to learn, particularly in this age where you don't have to know the commands to run programs, for instance. A program is now just a little icon that you click on. It's been made as simple as possible so anyone can use it - the least you can do is have some self esteem and stop calling yourself illiterate. If you can read, you can use a computer. Reading takes far more mental acuity than using a computer - though I admit that I learned to use a computer at an age remarkably close to the age at which I learned to read. If you can't say the same, I'll give you a little bit of philosophical slack there.
If you're over 50, you get a lot more slack due to growing up before the age of what we know as modern computer technology. This is because the human mind generally loses its ability to quickly and readily absorb new information at about age 26. This puts the pioneering age of personal computers, or at least commonplace usage of them, well past your years of optimal computer-learnage, to turn a phrase.
Still, the "you can learn anything" principle always applies. Ask anyone who you would consider an expert at anything - besides computers. Think of the one person you know who is best at a given skill or talent. Collectively, these people would tell you that when they began doing it, they were terrible at it and thought they would never get any better. Had the subject in question been computers, they might have even labelled themselves as computer illiterate. The key is that they didn't let that stop them for a second.
How To Approach "Computer Geniuses"
Calling yourself computer illiterate in front of a computer genius means basically nothing, for starters. Nearly everyone that person meets is by comparison computer illiterate, except perhaps people working in the computer industry. But think a little more deeply about what you are really trying to tell the person - wouldn't "I need help, can you teach me?" come across as a slightly more inviting attitude? When you've basically said "you'll have to do everything for me, and I won't even try to pay attention," an awkward smile just won't cut it (though smiles are generally never frowned upon, at least not in my book).
If you can't be sincere, at least act like you want to learn. I can't imagine that anyone would enjoy having to ask how to do every single tiny thing on a computer, particularly asking more than once about the same exact thing. So do you, on the flip side, imagine we like teaching these same basic things over and over, to people who refuse to put forth any effort in trying to learn?
Don't just make the motions - follow through. Bring a pencil and paper so you can write down instructions, if necessary. Don't force your local computer genius to write it down for you. Not only is their writing seldom self-explanatory (or even legible), but it would be like asking a scientist to help you do advanced science by writing down the instructions using basic formulas, ones they work with every day. Most scientists, if you were to do that, would simply hand you a copy of their calculus textbook. With computers, there is no such manual, or at least one with universal formulas like mathematics has.
Anyone who has been through high school should be competent enough at math to at least understand an equation, if not solve one. Computers, on the other hand, are all about how to use the mouse, and how to interact with the buttons and other elements displayed on the screen. To a lesser extent, it also helps to have a basic understanding of files and how data is organized and handled by the computer. Nearly every computer-usage instruction would begin with "move the mouse" and include several "click once here" or "click twice there" phrases. To be blunt, using a computer is far easier than math. Now programming computers, on the other hand...
As an aside, did you know that the first mice (for mainframe computers) were so large that you had to sit down on them and drive around, like a golf cart with a tail?
Just kidding! But the very idea sure is amusing!
There Are No Computer Geniuses
I'm going to assume that by now, you've realized you might actually be able to learn how to use a computer, and begun to move out of your self-imposed "computer illiterate" shadow. Great work so far! You're roughly halfway to learning something about computers. The next step is to realize that, no matter how much computer knowledge is possessed by any person you know, there will always be something that person simply can't help you with, no matter how much they would like to. If that person happens to be me, I will happily utilize my well-developed Googling skills* and at least help you try to find an answer somewhere, even if I don't have it.
*Yes, Googling is actually a skill. It seems to me that if there's something you're interested in finding, it is most likely "out there somewhere" in the world wide web. If it really is out there, chances are that I can find it. I figure it's a skill because 99% of the time, I found it, and the person who asked me to find it did a considerable amount of searching for it themselves. Here's a hint: the key words are the key! Even changing a single word in your search can bring up drastically different search results. For instance, "iPad keyboard case" versus "iPad keyboard cover."
The point is, a true computer genius simply does not exist, because it's a stereotype. Computers are a vast collection of highly advanced digital circuitry that no single person in the entire world knows everything about. Just understanding how they work, from beginning to end, is a monumental goal in and of itself, which very few people, even in the computer world, have ever achieved. Computers (to be more precise, personal computers) represent decades and even centuries of continuous research and development. Would you really expect a single person to know, for instance, everything there is to know about any other field? Artistry? Music? Food? Architecture? Then how is it reasonable to impose this same stereotype on someone who, at best, actually happens to be the world's leading authority on some tiny portion of computer technology, and at worst, knows a bit more about computers than you do? Odds are it's the latter, and in most cases, you've barely even met this person!
The answer is, it's no more reasonable than any other stereotype - quite unreasonable, really. Expecting their help simply because they are the "local computer genius" - and because you haven't invested any time or energy into your own knowledge of computers - is even more unreasonable. When you want to know something about flowers, do you drive down to the local flower shop and lampoon the cashier with questions you already know she can't answer, calling yourself "flower illiterate" - or do you head to Wal-Mart, buy a gardening magazine and some seeds, then head home and grab your hoe and shovel? Why should your approach be any different with computers?
I can't blame you if some hardware breaks and you want someone else to install it. Hardware is a somewhat different story, since that is far more specialized knowledge. However, most non-hardware issues can be solved by moving the mouse to the right spot and clicking. This involves no digging of holes or planting of flowers, and certainly nothing anywhere near advanced as basic math or even reading and grammar - so why harbor the ridiculous stigma that "I will never be able to understand or use computers?" In 90% of computer problems, simply Googling your problem and then following the first set of instructions you find will get you halfway to solving the problem, and by then it wouldn't be rocket science to figure out the rest.
What We 'Computer Geniuses' DO Know
Not to sound like I'm working backwards, but now that I've covered all that - many computer geniuses may actually know quite a bit about computers! Some more than others. This shouldn't surprise you, but in light of everything I've said, it's still something to consider. It becomes pertinent, then, to understand more about the different categories of computer knowledge (again, as with any other field) so that you can begin to pinpoint the areas you need expertise in, and therefore, target those individuals who will actually be able to help you. Why do I say this? Odds are if you consider someone a computer genius, he or she is quite likely to try and help you regardless of your request, even despite a complete lack of knowledge in that area. It's true! I do it all the time.
Is it just to keep up the 'computer genius' image? Is it out of sheer helpfulness? I can't say for sure, but it would definitely make things less confusing if more people read this post! Even just guessing the area your problem falls under might save your computer genius the effort of figuring that part out. Even if you guess wrong, you will have impressed them just by trying to apply yourself. For instance, instead of saying "I'm computer illiterate," say something like "I think I'm having a network problem - can you help me?" By contrast, when someone tells me "I'm computer illiterate," the implication is that they expect me to know everything and fix everything with little to no involvement from them.
Now, I certainly wouldn't offer to help anyone else if I didn't think I knew something about computers. In fact, I've studied computers quite a bit just for the sake of learning about them - what I know about computers is not all 'intuitive,' and it did not all enter my head the first time I laid hands on a keyboard and mouse. It doesn't always come easy, either. Still, one might wonder, why don't I consider myself a computer genius?
Let me answer this question in a somewhat odd fashion. I will attempt to categorize all computer knowledge. This should open your mind somewhat to just how vast the world of personal computers is. Then I will estimate my level of knowledge for each category.
In my experience, here are the basic areas of computer expertise, with a brief description of what exactly each one is:
Software - the available programs, or lists of instructions (computer code), that tell computers what to do
Hardware - the physical components that make up a computer
Networking - the wires, cables, airwaves, satellite transmissions, and other such things that computers use to communicate with each other, along with the hardware, software, and communication protocols involved
Digital Electronics - the super-tiny components that make up the hardware
Programming - the art of writing, testing, and distributing software (computer instructions) that performs meaningful tasks
Web Design - the art of creating websites
Server Architecture - the computers that run websites and the internet
Security - keeping data secure as it travels over the internet, and keeping computers from being infected with viruses (malicious programs)
Here's the kicker - this is so high-level that we have barely scratched the surface. Each of these areas is in itself an entire field, about which no one person could possibly know everything. Are you starting to get the picture?
Now, I will be using a percentage scale: 100% representing all available and possible knowledge about any one particular category, and 0% representing absolutely no knowledge in that area. Here is a breakdown of my own estimation of my computer knowledge and skill:
Software: 36%
Hardware: 3%
Networking: 4%
Digital Electronics: 7%
Programming: 22% (I am a software developer, mind you!)
Web Design: 11%
Server Architecture: 1%
Security: 18%
I think that should pretty well answer the question. My extreme lack of computer knowledge speaks for itself! I would define a true 'computer genius' as someone who has even 25% to 50% in each of these categories.
Don't assume from these low numbers that I am trying to be modest. This is the most accurate data I could come up with. Think about it for a moment: even though I'm a programmer by trade, I believe I know less than a fourth of all there is to know about programming. That's just one category - and my second highest!
What should really help drive the point home is when you consider the average computer problem. Here is another breakdown estimating what percentage of all computer problems (adding to 100%) occur in each category:
Software: 22% (not bad programming, just installation issues)
Hardware: 24%
Networking: 14%
Digital Electronics: 1%
Programming: 31% (this is basically the cause of most software problems)
Web Design: 5%
Server Architecture: 1%
Security: 2%
Since the average computer problem can be solved with a slightly-above-average collection of knowledge about software, hardware, and programming - something any nerd often pegged as a computer genius is quite likely to have - this explains how we appear to have "all knowledge." Networking generally takes care of the rest; however, that ends up being the most common unsolved problem category, since very few nerds actually know enough about networking to solve most such problems. Typically, the problems that find their way to computer geniuses from outside sources range from simple to about medium in difficulty; as was just stated, though, the network category is a bit different. Because that knowledge sits at a much lower level of the overall computer architecture, the configuration (and often the mathematics) involved is usually too difficult for your average computer genius to sort out, myself included. In my case, I have less of an excuse for this than you do for computer knowledge in general - I graduated from college with a double major in computer science and math. For me, the gap is simply a lack of detailed study (or more precisely, interest) in the networking field.
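To put some rough arithmetic behind that claim, here is a tiny sketch (in Python, purely illustrative - the numbers are just my own estimates from the breakdowns above, not measurements of anything) showing why covering the "big three" categories is enough to look like a universal expert:

problem_share = {   # my estimated fraction of everyday computer problems per category
    "software": 0.22, "hardware": 0.24, "networking": 0.14,
    "digital electronics": 0.01, "programming": 0.31,
    "web design": 0.05, "server architecture": 0.01, "security": 0.02,
}

# The categories a typical 'computer genius' knows slightly-above-average amounts about.
big_three = ["software", "hardware", "programming"]
covered = sum(problem_share[c] for c in big_three)
print(f"Problems falling in the big three: {covered:.0%}")                  # about 77%

# Networking 'takes care of the rest' - when someone actually knows it.
print(f"...plus networking: {covered + problem_share['networking']:.0%}")   # about 91%

In other words, someone comfortable in just three of the eight categories is already positioned to at least attempt more than three quarters of the problems people bring to them - no all-encompassing genius required.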
'Computer Geniuses' Have Problems With Computers Too
Finally, I would like to briefly refute the myth that computer geniuses never have computer problems of their own (since, obviously, they already know how to solve them all). This is not only quite untrue - the opposite is actually the case. Computer geniuses, or those regarded as such, typically have far more problems, and more difficult ones at that, than people who consider themselves 'computer illiterate.' Why does this occur? It isn't just because computer geniuses can usually solve the easier problems on their own - it's primarily because they do more advanced things on a regular basis and change settings far more frequently. As such, they are far more likely to break something in those advanced settings.
As an example, one of my first experiences with a computer was in the Windows Registry. I won't go into any detail here about what that is - just know that it's an important part of Windows, the primary collection of software (called the OS or operating system) that gives you all those little icons, buttons, windows, scroll bars, and other nice things to click on. Well, I was messing around in there without understanding what this Registry was for or how it worked. I began changing some things and found that now my programs would not run. I really thought I'd broken that laptop for good! However, I played around with it some more and was luckily able to get it back to normal. I admit it did require some deductive skill and intuition about what was going on with this Registry program, but it still taught me what it does and why it is there. This is a problem 70% of computer users would never have, because 70% of computer users don't have the foggiest idea that the Windows Registry even exists, much less any desire to open it or start changing things.
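If you're curious what peeking at the Registry looks like without taking the risk I did, here is a small read-only sketch using Python's built-in winreg module (Windows only; the key path below is just a common example, and reading values can't break anything the way my careless edits nearly did):

import winreg  # part of Python's standard library on Windows

# HKEY_CURRENT_USER holds per-user settings; this key is present on most machines.
key_path = r"Software\Microsoft\Windows\CurrentVersion\Explorer"

# OpenKey defaults to read-only access, so nothing here can change the Registry.
with winreg.OpenKey(winreg.HKEY_CURRENT_USER, key_path) as key:
    index = 0
    while True:
        try:
            name, value, value_type = winreg.EnumValue(key, index)
        except OSError:
            break  # no more values under this key
        print(f"{name} = {value!r}")
        index += 1

Looking before you touch is most of the battle - it was the changing-things-blindly part that got me into trouble.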
Whew! This will likely be my longest post for quite a while - maybe the entire blog. I guess I had a lot to say about this. It's certainly plenty of food for thought. I'll leave you with one major tip about solving nearly ANY computer problem. If all else fails...throw it out the window! Forget versions, OS levels, and actually having to know anything about computers to begin with. This solution works! And it gives you a great excuse to go buy a new computer. Although, to avoid any legal controversies and/or murder charges, you may want to ensure there is no one walking around on the sidewalk below said window before you go and chunk a computer out of it.
Tuesday, June 5, 2012
Scobbed Knobs
While driving through some hilly country I noticed a knob (an old English word for 'hill' - yes it's true, look it up). It would not have been particularly interesting, except that it happened to be scobbed - a word I am using to refer to the state of...hmm, how do I put this? Picture an almost-completely-shaved head, being scratched all over by someone's knuckles or fingernails. That's a pretty good idea of what a scobbed knob looks like. The actual definition is "picked clean" - I dunno, you figure it out.
What's the point? None, really. I just find it amusing. That, and it rhymes.
This post is actually about success. Weird, I know. And here you were expecting a post all about scobbed knobs. Well, don't worry, there'll be more of that later.
I certainly won't claim that my definition of success is the de-facto standard, or any standard at all, for that matter. This is simply the way I see it, and I have found it to be true on a personal level. I would give most of the credit for this to Mr. Herbert W. Armstrong - the four ideas that weave their way through this post, and my story, are the last four of The Seven Laws of Success, a free booklet by Mr. Armstrong.
I don't believe success can be defined like most other words. Instead, I hope to give you a mental picture that explains the most abstract form of the concept. They say a picture is worth a thousand words...what does that make a mental picture worth - especially one delivered via words on a page?
Imagine you're running. Running literally, but also running out of breath. You are nearing the point where you start to shake and you stop sweating because your body has lost a lot of sweat and is starting to conserve fluid. That thick substance that is certainly too thick to be saliva (but what else could it be?) coats your entire throat, threatening to fling itself out randomly with each misplaced gasp, each breath in which you open your mouth a little too wide, and exhale a little too forcefully.
A few more minutes pass, and you push yourself to continue. This is your perseverance - the endurance to see something past the moment where you first feel like giving up. However, at this point, it isn't all that difficult to continue, either - you have momentum, after all, and it's easier on your body to continue in motion than to completely stop (thus the need for cool down laps after a long run).
Soon, you are really beginning to ache. You can feel the blood pulsing, almost pounding in your feet, and your pace slows without your consent. Your breath is more ragged and shallow, and each new step feels more painful than the last. Exhaustion is setting in, and you are starting to feel a tad dizzy.
What's this? An obstacle ahead! A large boulder looms before you, and there seems to be no way around it. There is a long drop off on either side of the path, and there isn't enough room on either side to squeeze by. It's looking like you'll have to stop completely.
But not you! In addition to perseverance (and the drive which put you on the road to begin with) you also have resourcefulness - you have trained your jump height, and you also notice some great footholds on the front of the boulder. Aiming carefully in the last few seconds, your body supplies a small adrenaline rush as you scale the boulder and land shakily on the other side, continuing your stride unbroken.
On a minor and unrelated note, the narrow hill upon which the boulder was sitting looked quite scraggly. The grass was patchy and ragged, and the whole bit could therefore be considered a scobbed knob. (See? I told you there'd be more. But just wait! There's even more yet. But I digress.)
Now that you can see ahead, past the boulder, you notice the trail is much, much longer and harder than you imagined. There are far more obstacles coming up, and you're not even to the halfway point yet. Discouragement sets in, and you begin to wonder if finishing the run is even possible at this point.
Here is the critical moment. The situation has changed from a few minutes ago. You have now come so far that it would be quite difficult to continue - you have almost as long a run back as you do forward, though certainly with fewer obstacles, and along a route you've already blazed through, no less. Familiarity is on your side. Going forward doesn't even seem to be an option - in fact, it seems so difficult that giving up is precisely what you want to do, and it seems to be the only way out.
Going forward means success, regardless of the outcome. Going forward means you overcame your pain and exhaustion at the moment it mattered most, instead of giving up. Regardless of how far you are able to continue, odds are you will have gone farther than anyone else who ran down that same path.
Success, then, is largely a measure of your ability to discern these critical moments, and make the right choice at each one of them. This goes along nicely with one of my favorite quotes:
"To choose what is difficult all one's days, as if it were easy, that is faith."
-W. H. Auden
Thus, I ever so subtly imply (by including this quote) that faith is part of the measure of success. Quite a large part, I believe.
It's the end of the track. You made it. Breathing heavily, you ball up your hands into fists and use them to vigorously rub your head - partially to wipe the sweat off, and partially out of insanity and exhaustion. In the process, you've just scobbed your knob. It was inevitable - just like those critical moments where we choose, often unknowingly, between success and failure.
Failures are certainly stepping stones to success, I won't argue that - but just like stepping stones, if you spend all your time on them, going in circles, and don't make any progress toward your goal, you're likely to end up wearing them out - leaving you with a worn out path of scobbed knobs.
So you might as well accept the fact. Your knob is going to get scobbed one way or the other - why not make a success out of it?
I'll leave you with a short poem I came up with on the spot just for this post. It's about scobbed knobs, of course! It's written in my favorite form of verse, the limerick. However, most limericks are only a single stanza, and I seem determined to break that trend.
Scobbed Knobs
On a hill lived a farmer named Bob,
Where he planted some corn on the cob.
But the crop was so small,
That the birds ate it all.
All they left was a very scobbed knob.
Now the cobs were also quite scobbed,
And poor Bob, he really felt robbed.
He made the cobs into cobbler,
So the birds would gobble-er,
And over the fence it was lobbed!
The birds, they ate and they gobbled,
Then they reeled and babbled and hobbled.
Then they flapped and slobbered,
They knew they'd been clobbered.
By Bob's scobbed-knob cobs, they'd been cobbled.