Sunday, October 27, 2013

The Sop of Originality


Quick, which band is the originator of grunge music?

I bet most of you—something on the order of 97 to 99% of you, in fact—replied “Nirvana.”  Which is a lovely answer: their radio anthem “Smells Like Teen Spirit” is what introduced grunge music to the world.  I can remember the first time I heard it: it was Industrial Night at the Roxy, in downtown DC, in 1991.  I won’t go so far as to say it changed my life—I was already very much into alternative music, otherwise why would I have been attending Industrial Night?—but it certainly jolted my system.  I had no idea it was about to take the airwaves (and, shortly thereafter, the nation) by storm, but I knew this was something ... special.  Something profound.  It’s 22 years later now and I’m still hearing new songs from the Foo Fighters on the radio: that’s a decent run for any modern band and its descendants.  It doesn’t rival the Beatles or the Stones, but it’s a damn fine run, and it ain’t over yet.

But of course Nirvana didn’t invent grunge music.  The first incarnation of Nirvana came together in 1985 or ’86.  Soundgarden had already been around for at least a year, as had Green River, who begat Mother Love Bone, who begat Pearl Jam.  Green River’s roots, in fact, go back as far as 1980, and the roots of the Melvins go back to 1983, at least, and they together spawned Mudhoney, who is certainly the best Seattle grunge band you’ve never heard of, hands down.

And Seattle is the birthplace of grunge, right?  Here’s what Kurt Cobain told Rolling Stone about writing “Smells Like Teen Spirit”:

I was trying to write the ultimate pop song.  I was basically trying to rip off the Pixies.  I have to admit it.  When I heard the Pixies for the first time, I connected with that band so heavily that I should have been in that band—or at least a Pixies cover band.  We used their sense of dynamics, being soft and quiet and then loud and hard.


And the Pixies, you see, were from Boston, whose grunge scene is underrated nearly to the point of being unknown, even though it included great (but little-known) bands like the Pixies, Buffalo Tom, and of course Dinosaur Jr., who formed in 1983 and not only wrote what is arguably the best grunge quatrain ever:

I know I don’t thrill you
Sometimes I think I’ll kill you
Just don’t let me fuck up, will you
‘Cause when I need a friend it’s still you


but also what is surely the greatest remake ever.

But what is the point here?  (Other than to re-educate you on the finer points of grunge music, naturally.)  I think the point is that some Nirvana fans may be offended by my pointing out that they didn’t invent grunge; they merely popularized it.  As if that somehow takes away from their genius.  Am I saying that Nirvana is just a rip-off of the Pixies?  No, Kurt Cobain said that.  I think I’m saying that originality is overrated.  It’s held up as some sort of sacred cow, and, if a thing isn’t original, it’s therefore inferior.  But Nirvana is not inferior to the Pixies ... I’m not saying they’re better, merely that they’re not any worse.  Coming in second or third or fifth or tenth in the chronological list of grunge bands doesn’t make them any less insanely good than they truly are.  Plenty of bands had done what they did before them, but no one ever did it like they did, before or after.  Why do we care if they were first or not?

We can move into the wider world of music.  Can there be a Lady Gaga without Madonna?  No, not really.  Does that make Lady Gaga a “Madonna rip-off”?  Certainly not in the pejorative way that the phrase is generally used.

We’ll expand to movies.  Can Dark City exist without Metropolis?  No, certainly not.  Hell, I’m not sure Dark City could exist without The City of Lost Children, but that doesn’t make Dark City any less brilliant.  Hell, I’ve heard it argued that The Matrix doesn’t exist without Dark City (although their releases are close enough together that it’s more likely a pair than a rip-off), but that doesn’t take anything away from The Matrix either.

Comic books: I’ve always loved Moon Knight.  Moon Knight is a rich guy who fights at night with a mysterious, scary costume and uses a lot of gadgets ... sound familiar?  Yeah, Moon Knight is pretty much a Batman rip-off.  So what?  How does that make him any less cool?

Literature: I’ve already talked about how I feel about the Wheel of Time series being accused of being a Lord of the Rings rip-off.  I’ve also heard it accused of being a Song of Ice and Fire (a.k.a. Game of Thrones) rip-off, which is amusing, since the first book of Wheel of Time was published before George R. R. Martin even started writing the first book of Song of Ice and Fire.  But let’s say you’re willing to flip it around and accuse Martin of ripping off Jordan instead: I still say, so what?  If it were true that Martin deliberately and consciously sat down and said “I’m going to rewrite Wheel of Time, only better” (and I truly don’t believe he did), who cares?  What Martin produced is still awesome.  You could argue whether it’s better than Jordan or not, but, in the end, it’s different, and they’re both very good.  They could have been ripping each other off constantly throughout the respective series (which, although it’s true that Jordan started first, were being published simultaneously), and I would only be grateful for the cross-pollination.  It’s not like whoever got there first gets more points or something.

In my discussion about the Wheel of Time question, I made another analogy: Harry Potter being described as a rip-off of James and the Giant Peach.  I chose it for a number of deliberate reasons.  The most obvious being that James and the Giant Peach was published 4 years before J. K. Rowling was even born, so it completely eliminates any question of whose idea came first.  Also because I don’t think it’s a criticism that’s ever actually been made; rather it seems to be the case that any series which is even remotely like Harry Potter is proclaimed to be a rip-off of it: A Series of Unfortunate Events, Percy Jackson and the Olympians, Artemis Fowl, the Bartimaeus trilogy, the Septimus Heap series, Children of the Red King, The Secrets of the Immortal Nicholas Flamel, The Wednesday Tales, etc etc ad infinitum.  But of course the one that concerns me is the one that I’m currently engaged in writing (assuming I ever get back to it), Johnny Hellebore.

So this question of originality hits home for me, and I must admit I have an ulterior motive.  It only occurred to me after Johnny Hellebore was completely fleshed out as a character that he shares a lot of similarities with Harry Potter, especially physically.  He’s a white, English-speaking teenaged boy, thin, with black hair and eyes that are some shade of green.  The differences, particularly at this level, are so slight as to be laughable: American instead of British, a bit older, eyes more blue-green than Harry’s piercing green.  They’re both parentless, although Johnny isn’t an orphan, and one might even go so far as to make a comparison between Larissa and Hermione (although I feel that’s unflattering to Hermione, really).  The farther along you go, of course, the more you have to struggle for the similarities against the profound differences instead of the other way around, but by that point you’ve established your foundation, and your audience is more likely to grant you the benefit of the doubt.  And, while I’m telling you that all of this only occurred to me after the fact, you only have my word for that, no?

For that matter, while I can assure you that I was not consciously trying to “rip off” Harry Potter, how can I make any definitive statements about what my subconscious may or may not have been up to?  I certainly had read the Harry Potter books—several times—as well as listened to the audiobooks and watched all the movies.  And I respect the hell out of J. K. Rowling: she’s a dead brilliant author with an envy-inspiring talent for both characterization and plotting that I certainly could do worse than to emulate.  So was Harry kicking around in the back of my brain, casting an influence on this idea?  I’m sure he must have been.

Still, Johnny Hellebore is an entirely different story than Harry Potter.  One is aimed at younger readers, though it’s good enough that older readers will appreciate it as well; the other is aimed at older readers, and, though younger readers may certainly appreciate it, it requires a much higher maturity level.  One focuses on a sense of wonder and a fierce joy that only slowly becomes eclipsed by the darker themes of the series; the other is dark from the very first page, and it’s the joy and wonder that serve as the counterpoint.  One is a story of a boy growing into a man; the other is a story of a boy who is in many ways a man already, but who exists in a state of being “stuck”—not necessarily stuck in childhood, but just in a deep rut in his life, which is a state that all of us experience, at many different points in our lives.  One was very likely influenced by Roald Dahl; the other is more likely influenced by Stephen King.

Still, the comparisons will inevitably be made, and, on one level, I find it flattering.  As I say, Rowling is a brilliant author and even to be mentioned in the same sentence as her is quite nice.  Still, one doesn’t want to be thought of as a rip-off, right?  But then that got me wondering ... why not?

It seems to me that we’ve somehow elevated originality into some Holy Grail.  Everything has to be original.  Except ... nothing is original.  At this point in human history, everything can be said to be derived from, descended from, influenced by, or in the vein of, something else that we’ve seen or heard or read before.  There’s just so much out there ... how could you not sound familiar, even if only by accident?

So, I say, let’s set aside originality.  Can we not rather ask—should we not rather ask—is it good?  Who cares whether it’s original or not, as long as it’s valuable, inspirational, emotionally involving, socially relevant, philosophically touching, mentally engaging ... does it speak to you?  If it does, then doesn’t it deserve to be evaluated on its own merits?  I think it does.  I’ll take my Nevermind and my Doolittle, thank you very much.  They’re both pretty damn rockin’.









Sunday, August 4, 2013

Cynical Romanticism


Many of my friends seem to think I’m a pessimist.  They’re then quite surprised when I seem to display some trait of stunning (and often naive) optimism.  The truth is that I’m not a pessimist; nor do I have moments where I transition to being an optimist.  I am, in fact, a cynic.  But I’m also a romantic.

I’ve mentioned this dichotomy of mine before (more than once, even).  What does it really mean though?  To understand, it’s useful to examine the roots of both terms.

Cynicism is actually an ancient Greek philosophy.  You remember the story of Diogenes, don’t you?  (Of course you don’t—I shouldn’t either, really, but my mother had an odd idea of what constituted a well-rounded education.)  Anyway, Diogenes was the guy who lived in a tub on the streets of Athens.  He carried around a lamp in the daytime, waiting for someone to ask him why.  When they did, he would reply that he was looking for an honest man.  The Cynics were sort of proto-hippies, living “in accord with Nature” and eschewing things like wealth and fame as non-natural.  It wasn’t enough to reject these things, though: a Cynic was required to practice shamelessness (sometimes translated as “impudence”), by which they meant that they should deface laws and social conventions.  So they were sort of in-your-face hippies.

You can vaguely trace the connection from this attitude (telling everyone that they were fools for letting things like greed and conformity take them further and further away from the natural state of living) to the meaning that cynicism has today: the belief that people as a whole are vain, gullible, avaricious, and generally not that bright.  Steve Jobs once said:

I’m an optimist in the sense that I believe humans are noble and honorable, and some of them are really smart.  I have a very optimistic view of individuals.  As individuals, people are inherently good.  I have a somewhat more pessimistic view of people in groups.


Although I’ve always preferred the version from Men in Black:

J: People are smart.  They can handle it.
K: A person is smart.  People are dumb, panicky, dangerous animals, and you know it.


J has no answer to this, of course: he is a New York City policeman.  He does know it.

When I was young, I did the required stint in fast food.  My particular greasepit was Burger King.  I worked there when chicken tenders were introduced, and I lived through the “Where’s Herb?” campaign.  Part of this campaign was to get a particular burger for a dollar by mentioning the fictional Herb’s name.  During this period, I saw innumerable people in Burger King ordering a “Herb burger.”  Yes, that’s right: they had absolutely no idea what they were ordering—just that it was cheap.  I am fond of telling people that we could have served them a shit sandwich and they’d have been happy as long as they thought they were getting a bargain.  I’m also fond of telling people that Burger King is where I first began to lose my faith in humanity.  Looking back, I’m not sure that’s entirely true, but I can’t deny that ol’ Herb played a large role in pushing me down that road.  Certainly it’s the place where I learned to appreciate H. L. Mencken’s observation that nobody ever went broke underestimating the intelligence of the common man,* and surely Mencken is as big a cynic as Twain or Voltaire, two of my most cherished quotemeisters.

So do I have, as Wikipedia puts it, a “general lack of faith or hope in the human race”?  Yeah, pretty much.  My experience with politics, business, financial institutions, organized religion, and even smaller coteries of humanity such as neighborhood homeowner’s associations or Internet forum denizens tells me that, if you expect the worst from people, you’ll rarely be disappointed, and occasionally you get a pleasant surprise.  Which is much better than the inverse: expecting the best yields constant disappointment and the occasional situation where your expectations are merely met.  Thus, I’m entirely comfortable with being considered a cynic, even though I don’t think that’s the same as being a pessimist.  I’m happy enough to consider the glass half-full ... I just remain convinced that there’s every likelihood that someone else will come along and drain the glass before I get any.

Romanticism is also tied to nature: it was in some ways a revolt against the rationality of the Enlightenment, a way of stressing that one should go out into untamed Nature and stop trying to analyze it and categorize it and just feel it.  Romanticism was a validation of strong emotion—be it wonder, awe, passion, or even horror.  Especially for Art.  As one early Romantic German painter put it, “the artist’s feeling is his law.”  This was a movement of rejecting rules, particularly rules about Art, and it led to the Gothic horror tale and luminaries such as Edgar Allan Poe ... it’s certainly no wonder that I would experience a feeling of kinship towards it.

“Romantic” as a term implying love came later.  Even before Romanticism, “romance” was a term that referred to knights and heroic quests: Shakespeare’s The Tempest was considered a romance.  From knights to chivalry, and rescuing damsels in distress, plus Romanticism’s emphasis on strong emotions (such as passion), we eventually came to think of “romance” as primarily a love story, which today leaves us with Harlequin and Titanic.  Sort of a step down from Romanticism, if you think about it.  Not that there’s anything wrong with romantic love, of course: just that love is only one small part of Romanticism.

As a would-be-writer who idolizes Stephen King (among others), how could I not be attracted to the movement that gave us Poe?  Certainly there is no King (nor Straub, Koontz, Barker or Gaiman) without Poe.  This is a movement that also (albeit more indirectly) gave us Robert Browning, who I quoted in my deconstruction of one of my all-time favorite quotes, and who also inspired King’s Dark Tower series.  Like Cynicism, Romanticism was a rejection of rules, and especially the “rules” of conforming to a polite society.  Throw off the chains of conformity, they both proclaim.  Be an individual.

And that’s the heart of my outlook.  Note how both Jobs and Tommy Lee Jones laud the individual person.  And we don’t have to look far to hear more famous people doing so.  Margaret Mead once said:

Never believe that a few caring people can’t change the world.  For indeed that’s all who ever have.


How can you not take inspiration from that?  Pearl S. Buck said:

The young do not know enough to be prudent, and therefore they attempt the impossible—and achieve it, generation after generation.


So I believe that, despite the fact that humanity in general is close to useless, every individual human has a potential for greatness.  I believe that the universe works hard to put me in good places, and succeeds a surprising percentage of the time, even when the formless churning rat race of mankind is working hard to push in the opposite direction.  I won’t say I’m an optimist—the glass may indeed be half-empty.  But somewhere out there is a person who’s willing to refill it for me.  If I’m fortunate, and if I really need it, I’ll meet them.

This is not a philosophy so much as an outlook.  If you ask me about my philosophy, I’ll go back to balance and paradox.  But that theory is how I attempt to make sense of the world when it doesn’t seem to want to make sense on its own.  That’s different from how I approach the world, and what I expect out of it.  When it comes to that, I don’t expect much out of people, but I will never give up my idealism.  The world doesn’t owe me anything, and I wouldn’t expect to receive payment if it did.  But I continue to believe that the universe is a decent enough place, and that there will always be enough light to balance the dark, and that what you give out will surely come back to you.  In the end, Good will always triumph over Evil, even if Evil usually gets more votes (and always has better financial backing).

So I suppose it’s a bit like Mel Brooks says in The Twelve Chairs:

Hope for the best.  Expect the worst.
Life is a play.  We’re unrehearsed.


Although I would favor the formulation of Benjamin Disraeli:

I am prepared for the worst, but hope for the best.


Because I’m not a Romantic Cynic, after all: I’m a Cynical Romantic.  I may start with dread, but I always try to end on a note of hope.



__________

* Technically, what he said was: “No one in this world, so far as I know—and I have searched the record for years, and employed agents to help me—has ever lost money by underestimating the intelligence of the great masses of the plain people.”










Sunday, July 21, 2013

Restoration of balance: anticipated


Once again I’m going to forego my usual blog post, this time because I don’t really have anything to say I haven’t said before.  About a year and a half ago, I posted a rather long, rambling musing on the topic of fate.  This week I’m reminded of that posting, in a very positive way.  I continue to believe that everything happens for a reason, and even things that seem to be bad at the time can often lead to a better outcome than expected (or hoped).  Not always, of course.  But the universe has a way of tempering the bad with the good in such a way that makes it difficult not to believe that there is an ordered plan.  My current plan is still ongoing, so I can’t guarantee it’ll end up where I think it will.  But, so far, my faith in the universe is back on track, and aiming at a destination that I’m currently pretty excited about.

Until next week.

Sunday, February 24, 2013

Little Things Add Up

It has long been an axiom of mine that the little things are infinitely the most important.

Sherlock Holmes (“A Case of Identity”, Sir Arthur Conan Doyle)

A few years ago, the architecture team at my work (of which I am a part) put together a presentation for the business designed to explain why a serious rearchitecture was important.  We all contributed ideas and analogies and metaphors, and different ways to illustrate the problem.

One of the ones that I contributed was this:  Many times throughout your work week as a programmer, you run across things in our ten-year-old codebase that you just don’t understand.  Things that look insane.  Things that look like they couldn’t possibly work, and may in fact represent subtle bugs that no one’s ever been able to catch.  When you find such a thing, you have two choices: you can ignore it, or you can fix it.
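
Just to make the dilemma concrete, here’s a tiny, totally made-up sketch (in Perl, since that’s my world; the function name, the field names, and the rounding logic are all hypothetical, not lifted from any real codebase) of the kind of thing I mean:

    sub order_total {
        my ($order) = @_;
        my $cents = 0;
        for my $item (@{ $order->{items} }) {
            # Each line item gets rounded to the nearest whole dollar before
            # being added in.  That looks like a bug nobody ever caught ...
            $cents += 100 * int( $item->{unit_cents} * $item->{qty} / 100 + 0.5 );
        }
        # ... or maybe some downstream report depends on exactly this rounding.
        return $cents / 100;
    }

Do you “fix” the rounding, or do you back away slowly and leave it alone?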

If you fix it, you risk breaking something.  This, after all, is the source of the ancient adage that “if it ain’t broke, don’t fix it.” By “correcting” something without a complete understanding of just what the hell it’s supposed to do, you may correct one subtle bug only to introduce another.  The new bug could be worse.  It could cause a loss of revenue that’s not immediately obvious, and you might end up six months later in some business meeting trying to explain how you cost the company tens of thousands (or hundreds of thousands, or millions) of dollars because you “fixed” something that no one asked you to.  If you are a corporate programmer with any reasonable amount of experience, this has already happened to you in your career.  Probably more than once, even.

So, for any one given situation like this, the smart thing to do is to ignore it.  That is, from a risk vs reward perspective, or from a return-on-investment perspective (both of which are very proper business perspectives), the right thing to do, the responsible thing to do is to just leave it and move on.  Because the advantage of making your codebase just a tiny bit more sensible and sane isn’t worth that risk.

But the problem is, all those little things add up.  When you stand back and look at the big picture, over the course of ten years you’ve made thousands (or even tens of thousands) of individual decisions where each one was the right decision individually, but together they spell disaster.  Together, this approach means that you are literally incapable of ever improving your code.  Your code is, by definition, continually going to get worse.

Among the maxims on Lord Naoshige’s wall there was this one: “Matters of great concern should be treated lightly.” Master Ittei commented, “Matters of small concern should be treated seriously.”

Yamamoto Tsunetomo, Hagakure

It’s very popular in business culture to quote Sun Tzu.  After all, the competition of companies often seems like a war.  Plus pithy quotes like “attack the enemy’s strategy” and “speed is the essence of war” sound really cool when you break them out in a business meeting.  In reality, The Art of War does have quite a few valuable lessons for us.  For instance, “lead by example” (technically, “a leader leads by example not by force”).  That’s pretty good advice.  Or how about this one: “Pick your battles.”

Actually, what Sun Tzu said was: “Thus it is that in war the victorious strategist only seeks battle after the victory has been won, whereas he who is destined to defeat first fights and afterwards looks for victory.” But most people interpret that as just a more flowery version of “pick your battles.” And plenty of people have expounded on this theme.  Jonathan Kozol, educator and activist, once said: “Pick battles big enough to matter, small enough to win.” It seems like where Sun Tzu started was an aphorism on trying not to get into battles you know you can’t win.  But where Kozol, and most of us, seem to have ended up is closer to the military version of “don’t sweat the small stuff.” That is, learn to let the little things go so you can save your strength for the big ones, the ones that are really worth fighting for.

And certainly this seems to make good sense.  If you’re going to be losing any battles, they probably ought to be the ones that don’t matter as much ... right?  As we contemplate where to direct our energy, where to concentrate our efforts, when each little battle comes along, it’s always going to make sense to let it go and wait for the big problem that’s inevitably going to show up tomorrow.  But, the problem is, there’s a flaw in this reasoning.

See, little things add up.

I think we’re looking at the wrong end of the trite maxim spectrum.  Maybe we should be considering that “mighty oaks from little acorns grow.” Or that “a journey of a thousand miles begins with a single step.” This is what disturbs me about the (quite common) corporate attitude that, as employee freedoms are eroded bit by bit in the name of increased efficiency, each such loss is a small battle, and not worth fighting over.  I often speak out about such things, when I have the opportunity.  And I’m often told that I need to learn to let these little things go, because, you know, there are bigger fish to fry.  Pick your battles and all that.

So I get a lot of eyerolls, and shaken heads, and derisive snorts, because I’m making mountains out of molehills.  I need to go along to get along.  Because my argument is one of those “slippery slope” arguments, and you know how silly those are.  Let gay people get married today and tomorrow we’ll have legalized bestiality.  Stop people burning flags and next thing you know our free speech is gone.  That sort of thing.  Poppycock.

But here’s the thing.  When you first start a job, the world is full of possibilities.  And the environment of the place—the culture—is awesome ... if it weren’t, you wouldn’t have taken the job, right?  And if, later, after you’ve been there for a while, some little small thing that attracted you to the job is taken away, it’s not a big deal, right?  There’s still all the other things you liked.  And, the next year, if one other little thing disappears, it’s still no big deal, right?  This job is still far and away better than anything else you could find out there.  And if, the year after that, one more little thing is taken away ...

Here’s another saying for you: death by a thousand cuts.  None of those individual cuts hurt, really, but one day, you just realize that there’s no point in going on.  And you start to question whether it really is true that there couldn’t be something better out there.  And you toss off an email to some random recruiter and next thing you know you’re moving across the country to an even more awesome job.  (But then I’ve told this story before.)  And then you start the whole cycle all over again.

The really sad thing is that, no matter how often this happens, the corporate managers will never see it coming.  See, from their perspective, people are quitting over stupid, trivial things.  And people that will quit over stupid, trivial things ... you don’t want those people anyway, right?  There was nothing you could do.  They were unpredictable.  Anything might have set them off.

They’re missing the big picture.

They let the little things slide, because they weren’t worth fighting for, and the little things added up.  There were hay stems snapping dromedary spines right and left, to coin a phrase.  Is it really true to say there was nothing they could have done?  I wonder ...

I wonder what Sun Tzu would say if asked that question.  It’s sort of difficult to know for sure, seeing as how he’s been dead for about 2,500 years.  But I could hazard a guess.  I think he’d say this:

Treat your men as you would your own beloved sons. And they will follow you into the deepest valley.

Sun Tzu, The Art of War










Sunday, October 7, 2012

Things We Lose in the Timestream


Well, as it turns out, I’m swamped with doing stuff this weekend, so there’s no blog post for you.  Not that you should care anyway, of course.

You know, it occurs to me that occasionally I come along and tell you I’m busy with stuff, but I never actually tell you what the stuff is.  In retrospect, this seems unfair.  You come along (despite repeated warnings to the contrary, even), expecting to see some blather you can kill some time with, and here I am telling you there’s nothing to be read and not even bothering to say why.  Well, fear not, gentle reader: today I shall regale you fully with tales of my goings-on.  And this shall, hopefully, convince you never to wonder again.

So, firstly: often people mention what they’re reading, or listening to, and all that sort of thing.  Right now I’m working on book 7 of the Dresden Files, and determined to push all the way through to the end.  It’s just getting really good (it was good before, but now it’s really good, if you follow me).  Musically, I recently picked up a digital copy of Extractions by Dif Juz, which is one of the 4AD bands that I somehow missed all this time.  I knew Richie Thomas’ fine saxophone work from Victorialand, of course, but I’d never heard this album before, and it’s quite good.  You should give it a listen if you’re into dream or ambient or that sort of thing.  Visually, I just picked up my Blu-ray of The Avengers, which of course I had seen in the theater, but it was just as good the second time around.  That Joss Whedon really knows what he’s doing behind a camera; I hope he does more of the superhero movies.

Now, of course, all of that is not really keeping me from writing.  There must be other stuff going on around here ...

Well, I do still have a few hours to put in for $work.  I would tell you a bit about my work, but I’ve had to sign so many things at this point saying that I will not ever “disparage” the company that I feel a bit like Stephen Colbert talking about Islam: my company is a great and true company and Blessings and Peace be upon my corporate overlords.  Don’t point, even.  You’ve seen enough of that one.

So, tomorrow I have to do a presentation for my new co-workers (of which there are quite a few), plus I have a meeting about my current project, which I just started, and I’d really like to learn a bit more about it before I have to start explaining it to other people.  But mainly I want to prepare a bit more for the presentation.  I could just wing it, and I’d probably do fairly well, but the more prepared I am, the better I’ll do (most likely), and one does want to make a good impression on people that you’ve just hired.

Let’s see ... what else ... well, there are still some weekend chores left, despite the fact that I generally try to knock those out before Saturday night, or else I find I have no time to myself.  I’ve still got to go to the grocery store, and direct my older children to clean the den so that you can actually walk in there again.  (The youngest is excused from such things, although I’m sure she’d like the floor to be cleared as well, as right now there are a lot of things blocking her from getting to the catfood, which just pisses her off.  Nothing feels quite the same rolling around inside your mouth as a big ol’ handful of catfood.)

In hobby news, the things which are supposed to be my relaxation from other parts of my life occasionally have the power to provide their own sources of stress.  For instance, in my role as a CPAN author, right now I’m about three issues behind on the Method::Signatures module I work on, and one of them is for a guy who’s fairly well-known in the Perl world (and, even if you don’t have any idea what I’m talking about, the fact that the guy has his own Wikipedia page should give you a clue).  And, in my work to keep my favorite game going, we’ve been working hard to address a number of issues with one of our recent releases.

So, there’s lots to do, and (as always), little time to get it all done.  It seems that, the older I get, the less likely I am to have one big excuse for not doing what I should.  I mean, remember, back in college, when the reason you didn’t finish your essay for class was a party (or the resulting hangover), or one of your friends broke up with their boyfriend or girlfriend and you were up all night with them, or you were helping someone move?  But nowadays it’s never one big thing; it’s a million little things that peck away at your time jot by jot, frittering away your ability to focus in dribs and drabs.  It’s death by a thousand cuts.  But such is the way of life.  The older you get, the more you take on, I suppose, and the more people you meet, the more that end up depending on you for one thing or another, in large ways or in small.

Which is all a very roundabout way of saying to you, my oh so persistent blog connoisseur: no cookie for you!  Not this week, in any event.

Sunday, September 23, 2012

I'm too old for this shit ...


I believe in self-reflection and self-analysis.  (Of course, I also believe that such things are necessarily flawed, but perhaps that’s a topic for another blog post.)  I think it’s important to know what your faults are, what your limitations are.  Of course, I think that sometimes people want to identify their faults so they can correct them.  I have a slightly different approach:  If I can’t identify all my faults, I’m a blind moron, bumbling through life not even knowing the damage I’m doing.  Contrariwise, if I can identify all my faults, and if I could somehow correct them all, then I would be perfect.  I know that I cannot ever be perfect.  Therefore, either I’m never going to be able to see all my faults, or I’m going to be able to see them all but never fix them all.  I choose the latter.

That is, there are some faults that I have that I’ve just learned to live with.  They’re bad, sure, but they’re not so bad, and, if one has to have faults anyway (and, lacking perfection, one does), you may as well have some that aren’t so bad, right?  For instance, I’m too loud.  I have a naturally loud voice, and it carries, and the more excited I get about a topic, the louder I get.  Especially in an office environment, I’ve been asked many times throughout my life to keep it down.  Another problem I have is that I get pissed off at little things.  Not things that people do, so much: more like inanimate objects.  Like if I drop a cup and spill water all over the place, I am pissed at that cup.  This is moronic.  I know this.  But I still do it, and mostly I can live with that.

Now here’s the fault that I wanted to talk about today: I try to be too helpful.  Yeah, yeah, I know that sounds like one of those bullshit “flaws” that you dredge up during an interview.  (“Mr. Jones, what would you say is your biggest failing as an employee?”  “Well, sir, I’ve often been told that I just work too gosh-darned hard.”)  But note that I’m not claiming that I actually am too helpful, only that I try to be.  And, really, it isn’t correct to say that I try to be too helpful ... the truth is that I try too hard to be helpful, which is subtly different.

If you ask me a question, I want to give you the right answer.  If I can’t give you an answer, I feel bad.  Like, unreasonably bad.  Much worse than I would if I were to screw you out of a parking spot—worse even than if I were to screw you out of a job (unless perhaps I knew you personally).  That’s messed up.  But that’s the way I am.  If I give you an answer and it later turns out I was wrong, that’s even worse: then I feel hideously awful.  I have friends that think I have a burning need to be right.  I don’t think that’s true.  My father, for instance, has a burning need to be right.  He doesn’t ever admit that he was wrong.  I, on the other hand, have absolutely no problem admitting I was wrong: I just feel really crappy about it, if I think that someone was misled somehow (and that’s nearly always true, unless you were talking to yourself or something).  It’s sort of like a savior complex, but on a smaller scale.  I don’t feel the need to save people, only help them out a bit.

And, at first blush, this doesn’t seem so bad.  So I go out of my way to help people; what’s wrong with that?  Someone with a savior complex often has the problem of taking care of others so much that they forget to take care of themselves, but I don’t have that issue.  So where are the downsides, and how is this a fault?  Well, there are two main areas that I’ve identified, one smaller, and one larger.

The smaller issue is that I’m so constantly afraid of giving people the wrong information that I often over-qualify all my statements.  Now, I’ve talked before about my fear of absolute statements.  So, in one sense, this is just another facet of that.  But it goes further, I think: if I qualify everything I say to a large enough extent, I can never be giving you misleading information, right?  Many of my friends think I’m wishy-washy.  I don’t think that about myself, but I certainly understand why they do, and this is at the heart of it.

But here’s the bigger problem.  When I think someone is wrong, I have a desperate desire to “help” them by correcting their misconceptions.  Which can be okay, sometimes, if the person is receptive to that sort of thing, but often people aren’t.  And that just makes me try harder.  Which is code for “I’m a jerk about it.”  And, of course, it’s one thing if it’s a fact we’re discussing.  If I can tell you that you’re wrong, and we can look it up on Wikipedia or somesuch, then the question will be settled.  You may not appreciate my correcting you (especially if I did it in public), but at least there’s no more arguing about it.

But suppose it’s more of a matter of opinion.  Now, I’m okay if you have your own opinion about something.  If you have an intelligent, informed opinion, and I just happen to disagree with you, then fine.  I don’t have a need to “correct” you then, because you’re not really wrong.  But, let’s face it: most people’s opinions are not intelligent, informed opinions, and that includes mine.  I try (really!) to have the good grace to back down when it’s obvious that you know more about something than I do, but I find that I’m in a minority there, and sometimes I can’t resist either.

Here’s the situation that brought this to the forefront of my mind and inspired this post:  Just two days ago, I was in a meeting with several other technogeeks that I work with.  There were five of us, and we were talking about architectural decisions.  For some reason, the topic of TDD came up.  Now, I’ve actually talked about this exact situation before, and I even specifically mentioned TDD in that post.  I also mentioned my good friend and co-worker, and he happened to be in that meeting.  Perhaps I didn’t mention it, but he’s also my boss (everyone in the room’s boss, for that matter).  We don’t usually treat him any differently for all that, but it’s a fact that should not be ignored.

So, suddenly we find ourselves debating the merits of TDD (again).  What those merits are is not important to the story.  Suffice it to say that my friend, and one other co-worker, took the con side, and the remaining three of us took the pro side.  And the discussion got heated.  I found myself getting more and more frustrated as I tried to “help” them understand why TDD was so cool.

On the one hand, it made perfect sense that it should upset me so much.  Neither of the fellows on the con side had ever actually tried TDD.  And it was obvious from the statements they made that they didn’t have a very thorough understanding of it.  Them saying it was a bad technique was basically the same as my six-year-old claiming that he’s sure he doesn’t like a food despite the fact he’s never tried it.  It’s just silly, and therefore somewhat maddening.

But, on the other hand, I have to be careful, because I know how I get, because of my fault.  Here are people making a mistake: they’re espousing an opinion based on incomplete information and zero experience.  And, trust me: even if your opinion happens to be accidentally right, that’s still a mistake.  So, I see people making a mistake and I want to help them.  And I know that’s going to blind me to common sense.  (Well, I know it now ... seeing that at the time was pretty much a lost cause.)

And, here’s the thing: the other two people on the pro side didn’t get into the argument.  Why not?  Is it because they were scared to get into it with the guy who’s technically their boss?  No, not at all: we’ve all had technical discussions where we’ve been on the other side from our boss, and we don’t back down when we think it’s important.  So maybe they didn’t think it was important, then?  Maybe.  But I think I see a better explanation.

When your own kid tells you he’s not eating the fish because he doesn’t like it, even though you know perfectly well he’s never tried it before, you can get into it with him.  As the parent, it’s your job to teach your children to try new things, not to be close-minded.  If you don’t, who will?  Because, when it’s someone else’s kid telling you he’s not eating the fish, you just nod and go “okay, sure, kid, whatever you say.”  Because, and here’s the crux of the matter: why the hell do you care?

These other two guys are both younger than me, but they’re apparently much smarter.  The fact that our two colleagues are radically misinformed about TDD and think it’s bad even though they don’t understand it isn’t hurting them one whit.  It’s not stopping them from using TDD: the boss has said he doesn’t believe in it, but he certainly hasn’t banned it or anything.  In fact, he’s been supportive of other people using it.  So why bother to get into it?  Let the unbelievers unbelieve, if that’s their thing.

At the end of the day, who really gives a fuck?

Apparently I do.  Apparently I have this burning desire to convert all the non-believers and help them see the light.  And here’s where we fetch up against today’s blog post title: I just don’t have energy for that shit any more.  I’m looking at myself doing it and thinking, “why oh why am I even bothering?”  It’s not like these guys are thanking me for my “help.”  No, they’re just irked at my stubborn insistence.  And who can blame them?  ‘Cause, as I mentioned above, the longer this goes on, the more of a jerk I am about it.  So, here I am, pissing off people that I care about, over something that really doesn’t make that much difference in my life, just so I can say to myself afterwards that I corrected a misperception.  Seriously: what the hell am I doing?

I really am too old for this shit.  I need to learn to let go.  Today, when I logged into my work computer, it presented a pithy saying to me, as it always does.  I mentioned previously that I’ve customized these quotes, so mostly they’re familiar, but every once in a while it surprises me and hits with something I’ve forgotten, or something that’s just eerily appropriate.  Today it was both.

The aim of an argument or discussion should be progress, not victory.

    — Joseph Joubert


Yeah, good advice.  I think I’d forgotten it, somehow.  I need to try to remember that, next time I have this burning desire to “fix” somebody else’s “wrong” notions.  I’m going about it all wrong, I think.  And my family has a history of high blood pressure, so I need to chill the fuck out.

Ommmmmmmm ...









Sunday, August 5, 2012

Delays and Excuses


So remember how I said last week was a reading week?  Well, I’m still reading.  I went all the way back to the beginning, and it takes time to work through all that text.  Man, I wrote a lot.  I could probably use a good editor.  Except that she (or he) would probably cross out all my adverbs, and that would just piss me off.

So I’m not ready to present a new semi-chapter of my ongoing book.  My next thought was to fall back on a technical blog, but I’m not ready there either.  I’ve got a couple of really good ideas, but I’ve not had the time to work on them sufficiently to make them ready for blogination.  In at least one case, I think I could actually slap together a CPAN module, which would be pretty exciting.  Of course, to do that, I’d probably need to finish my Dist::Zilla customizations which I’ve been working on forever—well, I don’t need to, per se, but it would be more convenient, and I really want to finish that anyway.  Except, I got stuck on this other thing that I wanted to do for that, and I ended up making a suggestion to another CPAN author and then I agreed to do the thing with the thing and ...

Sometimes I worry that I’m too much of a perfectionist.  I do like things to be just right.  Sort of like Tolkien was ... or at least, like what the stuff I’ve read about Tolkien indicates that he was.  He always wanted to create just one more grammatical construct in Elvish, detail just one more century of Númenorean history, retranslate just one more line of Beowulf ... so much so that he had difficulty finishing things, at least according to some.  Not that I’m claiming to be as brilliant as Tolkien, of course—I still have some modesty—I’m just saying that perhaps I feel his pain.

I’ve often been told that Meg Whitman was fond of saying that “‘perfect’ is the enemy of ‘good enough.’”  To which my response is, generally, “perhaps, but ‘good enough’ is often the enemy of ‘we’d like to have it last for a while instead of falling apart due to shoddy craftsmanship which was deemed “good enough” at the time.’”  Still, there’s no doubt that Meg’s formulation is pithier than mine, so probably hers is more true.

That was a bit of sarcasm there.  Sorry.

Still, one can’t deny that she (or, if we want to be pedantic about it, Voltaire, who originally said “Le mieux est l’ennemi du bien”) has a point.  As you may guess from my previous posts, I think the truth lies somewhere in the middle.  But the tricky part is knowing where to draw the line.

Today I’m leaning a bit more toward the “perfect” than the “good enough.”  Although, one could make the argument that, in settling for this particular blog post (which is about a third as long as I normally strive for), I’m actually taking a pretty firm stance on the “good enough” side.  But mainly I’m saying I want a little more time to polish things.

Also, I’ve been putting in an unusual number of work hours lately, and that ain’t helping.  Plus ... I ran out of gas.  I had a flat tire.  I didn’t have enough money for cab fare.  My tux didn’t come back from the cleaners.  An old friend came in from out of town.  Someone stole my car.  There was an earthquake.  A terrible flood.  Locusts!

Or, er, something like that.  Yeah, that’s the ticket.*





* Eek! Stop me before I cross-reference again!**

** Too late: SNL trifecta.

Sunday, July 1, 2012

A Mistaken Hue



A ship in a harbor is safe, but that’s not what ships were built for.


This is one of the earliest quotes I can remember being inspired by.  Like many quotes, its attribution is uncertain; when I first came across it, in a calendar I bought at the college bookstore my freshman year, it was ascribed to that perennial wit, Anonymous.  Then I found out that it was said by someone really famous (undoubtedly either Voltaire or Mark Twain), and then that it was uttered by William Shedd (whoever that is).  Now that I check again, Wikiquote tells me it’s a quote from John Augustus Shedd, from his classic tome Salt from My Attic.  Which is apparently a book so obscure that some people question its very existence.

But no matter.  The quote is a good one, regardless of who said it.  It’s simple, direct, and evocative.  I immediately interpreted it to be a reference to matters of the heart, but of course I was young and stupid then (and, as it happens, in love with someone who didn’t return my affections).  So of course I would see the romantic side of this quote.

And yet ... this quote can be interpreted so much more broadly.  It can be a metaphor for the folly of playing it safe, in life in general.  Perhaps you’ve seen some variation on this old chestnut:

If I had my life to live over, I would try to make more mistakes.  I would relax.  I would be sillier than I have been this trip.  I know of very few things that I would take seriously.  I would be less hygienic.  I would go more places.  I would climb more mountains and swim more rivers.  I would eat more ice cream and less bran.

I would have more actual troubles and fewer imaginary troubles.

You see, I have been one of those fellows who live prudently and sanely, hour after hour, day after day.  Oh, I have had my moments.  But if I had it to do over again, I would have more of them—a lot more.  I never go anywhere without a thermometer, a gargle, a raincoat and a parachute.  If I had it to do over, I would travel lighter.
:
:
If I had my life to live over, I would start barefooted a little earlier in the spring and stay that way a little later in the fall.  I would play hooky more.  I would shoot more paper wads at my teachers.  I would have more dogs.  I would keep later hours.  I’d have more sweethearts.

I would fish more.  I would go to more circuses.  I would go to more dances.  I would ride on more merry-go-rounds.  I would be carefree as long as I could, or at least until I got some care—instead of having my cares in advance.


As it turns out, this was not written by the mythical 85-year-old “Nadine Stair,” nor is it an English translation of a Spanish poem by Jorge Luis Borges.  It’s actually a piece from the Reader’s Digest (which makes sense, given the tenor), written by a 64-year-old named Don Herold.  Again, though, it’s irrelevant who wrote it: does it ring true?  Does it say something worth listening to?  I think perhaps it does.  I think it tells us to take the ship out of the harbor.

Here’s another, different version.  When I get a movie on DVD, I often watch the “special features,” which my eldest used to call the “great theaters” (when he was much younger, of course).  Watching the Great Theaters on a DVD is one of my habits that most of my family couldn’t care less about; generally they all get up and leave the room while I check out all the behind-the-scenes info on the making of the cinematic magic.  Often I do this whether I particularly liked the movie or not; sometimes I even find the making-of bits (or the bloopers, or the deconstructions of the stunts and special effects) more entertaining than the movie itself.

But I digress.  The point is, when I first watched Bend It Like Beckham (which I actually did enjoy), I watched the Great Theaters.  All of them.  The movie is about a British girl of Indian heritage, and her father is played by Anupam Kher, who’s a rather famous Bollywood actor.  Throughout the Great Theaters, he kept saying this quote over and over again, using slightly different words, because he felt it summed up the spirit of the movie so well.  I’m sure he was quoting someone else, but I’ll give him the credit, since he’s the one who burned it into my brain.  Here’s my favorite of the several different ways he phrased it:

If you try, you risk failure.  If you don’t, you ensure it.


I rather like this, because it takes the original quote and steps it up a notch.  Now it’s not just a missed opportunity you’re stuck with if you don’t risk taking the ship out of the harbor.  You’re actually failing by failing to move.  You’ve not only gained nothing, you’ve lost everything.  You think you’re staying out of the game by refusing to play, but you’re not: you’re forfeiting.

Anupam Kher gives us the short version.  If you’d like it spelled out a bit more clearly for you, how about we listen to Benjamin Hooks, executive director of the NAACP from 1977 to 1992:

The tragedy in life doesn’t lie in not reaching our goals.  The tragedy lies in having no goals to reach.  It isn’t a calamity to die with dreams unfulfilled.  It is a calamity not to dream.  It is not a disaster not to capture your ideal.  It is a disaster to have no ideal to capture.  It is not a disgrace to reach for the stars and fail.  It is a disgrace not to try.  Failure is no sin.  Low aim is a sin.


Hooks was a Baptist minister and a lawyer, so I tend to trust the man when he talks about sin.

I often say that I am a romantic, despite the fact that I’m a cynic (a dichotomy to which I should really devote a blog post of its own).  This is one of the expressions of that outlook.  I will continue to write my novel even though I’m far too old to become a famous writer (although of course Stieg Larsson is always an inspiration—hopefully I won’t need to die first, as Larsson did).  I will continue to demand a work environment where I can relax and have fun even though it’s “unrealistic” to expect a business to be run that way (never mind that I myself ran a business exactly that way for 12 years).  I will continue to encourage my children to follow their own dreams, even if those dreams are completely ineffectual ways to earn a living.  Because, as Robert Browning tells us:

Ah, but a man’s reach should exceed his grasp,
Or what’s a heaven for?

Sunday, March 11, 2012

Relativistic Absolution


I have a horror of absolute statements.

It might even be a phobia, now that I ponder it.  It starts with my experience of certain people: my father was fond of absolute statements, as was the first person I took on as a partner after I started my own company.  Both of these people have something in common: they believe that if you state something with enough confidence, people will believe you.  It didn’t much matter whether the something was actually true or not.  This actually works, sort of, especially on strangers.  Unfortunately, people that have to listen to you on a regular basis quickly learn that the more confident you are (and the more absolute your statement is) the more likely you are to be full of shit.

So I myself learned to be more cautious when I state things.  With the result that many folks (including some of my closest friends) think I’m “wishy-washy.”  I dunno; maybe I am.  I certainly don’t like to be wrong, although I think many people think I feel that way because of pride, or a need for superiority.  The truth is, I just feel bad when I’m wrong.  If I tell you something, and then it turns out I was wrong, I’ve misled you.  That makes me feel crappy.  You came to me for information (and, the older I get, the more that happens, obviously), and here I went and told you the wrong thing.  Makes me feel like a right bastard.

In addition, my whole philosophy of life reinforces the concept that absolutism is useless.  Again and again in this blog I’ve talked about how I believe in two competing things at once: from my initial post on what I (only half-jokingly) mean when I claim to be a Baladocian, to paradoxical views on reality and perception, semantics, uncertainty, quotes, parenting, hype, and grammar.  (Wow, that list was even longer than I thought it was going to be when I started to write it.)  With that many posts about how two seemingly contradictory ideas can both be simultaneously true, is it any wonder that I tend to stay away from statements that pretend there’s only One True Way to view the world?

But if I had to pick one single reason why I don’t believe in absolute statements it would certainly have to come back to ... a book.  Now, there are five books which I think of as having changed my life.  Four of them are fiction: Stranger in a Strange Land, Cat’s Cradle, Legion, and The Dispossessed.  None of these are perfect—charges of sexism against Heinlein are mostly true, and Blatty’s books require a strong stomach in places—but each of them caused some fundamental shift in how I viewed the world.  The characters of Valentine Michael Smith, John (a.k.a. Jonah), Lt. Kinderman, and Shevek all have something in common: they are all thrown into strange settings (Earth, San Lorenzo, a supernatural murder, Urras) and their attempts to grapple with the bizarreness they’ve been thrust into generate philosophical ramblings in addition to essential plot points.  The plots of these books are very good, but that’s not why I list them here; in terms of sheer plot, there are many other books I like better.  No, it’s the philosophical ramblings that are the important bits.  Smith’s handling of money and religion, Kinderman’s views on the impossibility of evolution, John’s exploration of truth and lies, Shevek’s reflection on language and possessions ... these are the aspects which challenged my worldview and caused it to shift, sometimes in large ways, sometimes in small.

But perhaps none of these shook up my brain patterns as much as Quantum Psychology, a book by “science fiction” author Robert Anton Wilson.  I put the term “science fiction” in quotes, because, although some of what RAW (as he’s often affectionately known) writes is definitely science fiction, much of it can’t be categorized so simplistically, and quite a lot of it (including Quantum Psychology) isn’t really fiction at all.  In fact, Quantum Psychology reads like a textbook ... but a textbook for a class like no class you’ve ever taken before, nor are particularly likely to, for that matter.  I find it difficult to believe that quantum psychology has ever been taught in a college setting, even in the most liberal of institutions.

And yet, after reading it, you’ll wonder why not.  Well, you’ll also know why not—primarily because few teachers could present it and few students would “get” it—but you’ll still marvel that we don’t all have to learn this stuff.  At least I’m pretty sure you will.  I know there are people who are simply not wired to handle this sort of introspection, and, if you happen to be such a person, I fancy you’ll proclaim it to be pretentious tripe.  And that’s no reflection on you personally.  Maybe one day in the future it would make more sense.  Or maybe you can’t get past RAW’s dismissive stance on the world’s religions (in the same way that staunch feminists will have serious problems looking past Heinlein’s rather primitive portrayal of women in Stranger in a Strange Land).  Or maybe you just don’t care to dissect the universe that much.  That’s okay.  As always, I refer you to the masthead.

But if you’re the sort of person who’s bothered to read this far (which of course you must be) I bet you would find QP just as fascinating as I did.  Now, there are many vital concepts to be learned from this book, but one of the most fundamental is also (perhaps unsurprisingly) one of the earliest presented: E-prime.  I’ll let Wilson explain it:

In 1933, in Science and Sanity, Alfred Korzybski proposed that we should abolish the “is of identity” from the English language.  (The “is of identity” takes the form X is a Y, e.g., “Joe is a Communist,” “Mary is a dumb file-clerk,” “The universe is a giant machine,” etc.)  In 1949, D. David Bourland Jr. proposed the abolition of all forms of the words “is” or “to be” and the Bourland proposal (English without “isness”) he called E-Prime, or English-Prime.

Okay, that’s what it is ... but what’s the point of it all?

The case for using E-Prime rests on the simple proposition that “isness” sets the brain into a medieval Aristotelian framework and makes it impossible to understand modern problems and opportunities.  ...  Removing “isness” and writing/thinking only and always in operational/existential language sets us, conversely, in a modern universe where we can successfully deal with modern issues.

Okay, so the problem appears to be with our friend (and nemesis) Aristotle again.  Remember him from the balance and paradox discussion?  He’s the fellow who told us there were four elements (when there weren’t), and five senses (when there weren’t), and two possible truth values ... when we know the world is more complicated than that.  Well, it turns out that Aristotle had another potentially problematic habit: that of describing how the world actually “is.”  Or, as RAW puts it, “the weakness of Aristotelian ‘isness’ or ‘whatness’ statements lies in their assumption of indwelling ‘thingness.’”  But the truth is, again, more complicated.  If you think about it, it doesn’t actually make any sense to talk about what something “is.”  We can talk about things we’ve seen, or otherwise experienced, or we can talk about our opinions on the world or the things in it, or we can talk about how things act, or how we remember they acted.  But what something “is”?  Once you let go of your Aristotelian prejudices, it doesn’t actually make any sense.

RAW gives us a few examples of where “is” can lead us astray.  “That is a fascist idea.”  As long as the proposition is put thus, it’s bound to lead us into an argument.  We could fight over the technical definition of “fascist,” or we could argue about the intentions and/or beliefs of the person who came up with the idea, or we could debate whether people’s perception of it as fascist overrides any consideration of whether it actually is fascist.  Now, what if we restate the proposition in E-Prime?  “That seems like a fascist idea to me.”  Well, not much to argue about there, is there?  I could claim you’re lying, I suppose, but honestly: why bother?  If it seems like a fascist idea to you, okay.  It doesn’t seem like a fascist idea to me.  Glad we had this little chat.

So, see how “that is a fascist idea” is an absolute statement, while “that seems like a fascist idea to me” is properly qualified?  And also how the absolute statement is problematic, while the qualified one is just fine?

I could go on (as RAW does), but just think about it.  Think about the last time you had an argument with someone, and see if the word “is” wasn’t intimately involved somehow.  “That is a very bad idea.”  “Republicans are all in the pocket of big business.”  “Gay marriage is destroying American family values.”  “Religion is the opiate of the masses.”  “This movie you recommended is crap.”  “You are so frustrating sometimes!”  The “is” is the part that makes it an absolute statement, and the worst part about that sort of absolute statement is that it has us making judgement calls we can’t possibly back up, stating opinions as facts, and describing the very essence of things, when the nature of the universe mandates that all reality is mediated by our senses, so that the best understanding we can ever achieve is still just a mental picture of that reality.
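
Since I spend my days writing code anyway, I couldn’t resist turning the idea into a toy.  Here’s a minimal sketch in Python (my own, not anything RAW or Bourland proposed) that flags every form of “to be” in a sentence, following Bourland’s stricter version of the idea rather than Korzybski’s “is of identity” alone:

import re

# Flag every form of "to be" (Bourland's version of E-Prime bans them all,
# where Korzybski only wanted the "is of identity" gone).  A toy, obviously:
# it misses contractions like "it's" and "I'm", and it can't tell an
# identity claim from an auxiliary verb.
BE_FORMS = re.compile(r"\b(am|is|are|was|were|be|been|being)\b", re.IGNORECASE)

def isness(sentence):
    """Return every form of 'to be' found in the sentence."""
    return BE_FORMS.findall(sentence)

for s in ("That is a fascist idea.",
          "That seems like a fascist idea to me."):
    hits = isness(s)
    verdict = "violates E-Prime " + str(hits) if hits else "passes E-Prime"
    print(s, "->", verdict)

Run it over the two example sentences and the first one gets flagged while the second passes clean ... which is all E-Prime really asks of us.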

Now, note that I don’t actually write in E-Prime—neither in general, nor even in this particular post.  In fact, go back and look for the places where I’ve used “is” (or “are” or whatnot) and notice how those statements are the very ones that provoke you, that are confrontational, that make assertions that I can’t actually prove and challenge you to apply your brain instead of just accepting whatever I say at face value.  If I had written this entire post in E-Prime, that would have made it very difficult for you to disagree with anything I said.  But maybe I wanted you to disagree.  Maybe I wanted to shake you up and make you think.

So, even though I think that E-Prime is a fundamental concept that everyone should understand, I personally believe that not using E-Prime has some value as well.  But, of course, that’s just my opinion.

Sunday, February 19, 2012

Amor Fati


I seldom end up where I wanted to go, but almost always end up where I need to be.
        — Douglas Adams

Some people believe in destiny.  The idea that the threads of our lives are woven together in a tangled skein is an attractive one, and reappears throughout history: from the Moirai of the Greeks and the Norns of the Vikings to the Wheel of Time in Robert Jordan’s series of the same name, which gives us the quote “The Wheel weaves as the Wheel wills, and we are only the thread of the Pattern.”  The reason this concept is so tempting is that it accords with our experience of the world.  If you stop and think back on your life, you’ll see a hundred different coincidences, a hundred different times where, if one thing had gone only slightly differently, your whole life would be on a different course.  In fact, looking back at all the little things that had to go just so to lead you to where you are now, it’s enough to make anyone ponder whether there might be something to this concept: call it fate, destiny, fortune, karma, kismet, call it random chance or divine providence, say que sera, sera, or say the Lord works in mysterious ways his wonders to perform, or say the universe puts us in the places we need to be, but any way you slice it, it’s hard to pretend there’s nothing behind the curtain.

For instance, say I had not dropped out of college: then I wouldn’t have gotten my first job as a computer programmer.  I might have become one later in life, maybe, but it wouldn’t have been the same.  Say I had not accepted the offer to leave that job to form a two-man company with one of my former co-workers, which only lasted a few months ... well, then, I might never have ended up going back to school to finish up my degree.  I know for a fact that if I had not accepted an invitation from a friend of mine attending college in the DC area to come spend a week with him that I never would have moved to our nation’s capital, where I spent 18 years of my life.  I know this because I had already applied (and been accepted) to another college; it just so happened that I had missed the deadline for fall admission at the college of my choice and I was going to have to wait until the following spring.  But this school my friend was attending still had spots open—not for freshmen, but, then, I was a transfer—and a surprisingly decent English program, and so it became my alma mater.

And that’s just the beginning.

Somewhere out there in the wide world is a woman whose name I can’t remember, born in Hawaii, with the dark skin and exotic beauty to prove it.  She went to high school in Los Angeles, and her sister (or her cousin, or her best friend—I forget) went out with one of the guys from Jane’s Addiction.  Somehow she ended up moving across the entire country, and wound up in Fairfax, in Northern Virginia, just outside DC, working at a cheesy little college pub.  And, if she had not come out of the back room that day, and had she not been so pretty, and had she not smiled just so, and had she not looked at me and my friend and said “two applications, then?” ... if all that confluence of chance had not come together at that exact moment in my life, when I was just giving my friend a ride around to various restaurants so he could find a job as a cook, since it just so happened that he didn’t have a car, and just after an exhausting two or three weeks wherein I learned that my experience was enough to get me any number of programming jobs, but there was apparently no such thing as a part-time programming job (at least not in that place at that time) ... if all that chaos theory had not converged on that exact moment in time, would I have cut off my friend’s “no, just one” with a resigned “what the hell, sure, two applications”?  Probably not.  And if I had never taken that job, I would have never engaged in the childish electronic prank that introduced me to the computer salesman who became my first business partner, which eventually led to my starting my first company, which eventually got me a consulting job at a large corporation, where I eventually met the woman who is my partner to this day, and who is the mother of my children, who are essentially the entire point of my existence.

That’s a lot of “coincidences.”

When business for my company dried up, and my meager savings were running out, another friend of mine just happened to mention a job that he had interviewed for but had decided not to take, thinking I might like it there.  Turns out I did, and I spent three and a half years there, meeting some folks who are still some of my favorite people of all time, and having a really great job where I got to learn a lot of stuff, and teach a few things, and have a great deal of freedom, which was important, because I was coming off of working for myself for 13 years, and I’d utterly lost the ability to wake up early (not that I’d ever really had it, for the most part), or wear shoes at work, and I had 13 years’ worth of ponytail between my shoulder blades.

The story of how I left that job and came to the great state of California is yet another of those sets of bizarre, interlocking coincidences.  Last week I told you what I thought of corporate managers telling you you must take PTO when you’re slightly sick and you want to work from home.  As Bill Cosby once said, I told you that story so I could tell you this one.  I’m not going to use any names here: if you know me, you most likely know the person I’m talking about, and if you don’t know me, you most likely wouldn’t recognize the name anyway.

When I first started at this job I’m talking about, the first job after running my own company for 13 years, I had a boss who lived in Boston and showed up for a couple of days every other week.  Despite not being around very often, this person was one of the best bosses I’ve ever had.  I was given very clear directions, never micromanaged, trusted, encouraged ... the only criticism I ever got from this boss was to step up my game, to take more responsibility, to stop worrying about stepping on anyone’s toes and take the lead on things.  This company was a subsidiary of a larger, public corporation, but our boss kept us insulated from any politics and let us do our own thing.  There was only one layer between our boss and the corporate CEO, and that VP and our boss seemed to get along just fine.

Then the synchronicity dominoes started to fall.  The VP left, and was replaced by a real asshole of a human being, one of those corporate jackasses who believe that being a jerk is a substitute for leadership.  In less than a year, the replacement was gone as well, apparently liked by no one, including the CEO, but it was too late: my boss had also submitted a resignation, and I was destined to receive a new manager, who would end up being one of the worst bosses I’ve ever had.  And I once worked for a twitchy Vietnam vet with a bad coke habit.

This new boss micromanaged everything, trusted no one, didn’t understand how to encourage people (pushing bullishly instead), had no respect for the culture of the company, and basically ticked off every mistake that a corporate middle manager can possibly make.  It was like this person had a manual to go by:  Sow distrust and dissension among employees? Check.  Freak out and yell at people in front of co-workers? Check.  React to problems by increasing the number of useless meetings? Check.  I swear, somewhere out there is a book that tells these people exactly how to act, because the number of them who all do the same stupid things over and over again can’t be explained any other way.

It was Memorial Day weekend of 2007.  I was feeling a bit under the weather, but there was a big project going on at work, and I knew we’d all regret it if I fell behind.  This new boss wasn’t my favorite person, but I still loved the company, and I wanted to do my best to make the (completely artificial) deadline.  That Friday, I sent my email saying I wasn’t feeling well, but I was going to soldier on.  Then I got to coding.  When I checked again, on the holiday itself, I discovered a snarky email from my boss, advising me that if I was sick, I should take PTO and not work from home.

I promptly replied that I was deeply sorry that I had attempted to make progress on our big project, and I assured my boss that it wouldn’t happen again.

I then went to check my spam folder, because that’s where all the recruiter emails invariably end up.

If you’re a technogeek like me, you know that once that very first recruiter finds you, there will follow a never-ending stream of offers for jobs in your specialty, jobs not in your specialty, jobs nowhere near the vicinity of your specialty, and non-specific vague pretensions of maybe possibly having a job for you one day so they’d just like to stay in touch.  Mostly you just ignore them ... until you get ticked off with your current work.  Then you realize that you’re sitting on a gold mine, tucked away in your spam folder.

I had always lived on the East Coast: 22 years in Tidewater, on the VA-NC border; 1 year in Columbia, SC; and the aforementioned 13 years in the greater DC metro area (partly in Northern VA and partly in Southern MD).  But if anyone asked me where I really wanted to live, I always said California.  I later expanded to the West Coast in general: Oregon is lovely (although, as it turned out, practically impossible to find a tech job in), and Washington is not a bad choice either (lots of tech jobs, but perhaps a bit colder than I’d ideally like).  But really it was California that had caught my interest; two trips to Borland out in Scotts Valley and a couple of visits to San Francisco to visit an architect-turned-tech-entrepreneur friend of mine had cemented Cali—and the San Fran-San Jose corridor in particular—as the place to be.  So when I went looking for recruiter spam, I figured I might as well find something that said “California” on it.

There were only 3 or 4 recruiter emails, as it turned out ... a light dusting compared to what I normally had.  One of them said “Santa Monica, CA.”

Now, I didn’t know where Santa Monica was.  And I was too much in a huff to look it up.  But I knew where Santa Clara was, and I knew where Santa Cruz was, and I figured ... how much farther away could it be?

Pretty far, as it turns out.  Santa Monica is in Los Angeles county, and is (along with Venice Beach and Marina del Rey) one of the beach cities of LA.  As it turns out, my partner used to live in (or just outside) Santa Monica.  All that I was to find out later, though.

It was Monday (Memorial Day) that I sent a random email back to a random recruiter that I plucked out of a spam folder; on Tuesday, I got a garbled message from someone with an unintelligible accent—on a hunch, I called back that same recruiter and it turned out to be him; on Wednesday, I was talking to the recruiter’s boss, who was telling me about a company which had very high standards and was willing to pay full relocation; on Thursday, I had a phone interview with the folks who would eventually end up being my new bosses—this was conducted on my cell phone, while I was driving through the middle of downtown DC, trying to avoid the hideous traffic on the Wilson Bridge; on Friday, I was talking to someone at eBay corporate about a plane ticket; the following Monday night I got on a plane; Tuesday, I had what was possibly the best job interview of my career (probably second only to the one at the corporation where I met my partner), and they made me an offer on the spot; on Wednesday, I received a signed offer letter in my email; and on Thursday, I handed my boss a brief resignation letter.  So, to wrap up the discussion from last week, that’s under two weeks from the time my corporate middle-manager boss pissed me off over something stupidly trivial until the time I had a better job for about 25% more money (although, admittedly, part of that was simply to cover the higher cost of living in LA), and my old company lost three and a half years’ experience and half their tech department.  Something for you corporate folks to chew on.

But the real lesson is, as far as I’m concerned (and as far as my family is concerned), when something is meant to happen, it will happen, and often with blinding speed.  I could tell you the story of our new house, for instance, which includes passing on it when it was overpriced, it disappearing from the market and then, strangely, reappearing for a cheaper price, and even a prophetic dream ... but I’ve babbled on for quite a while already.  No need to beat a dead horse, I think.

I’ve long felt that whatever force runs the universe, be it divine, karmic, quantum, or ontological, be it moral, predestined, anthropomorphic, cyclical, or merely mechanical, has been quietly and efficiently doing His/Her/Its job for me, or on me, putting me where I am today and seemingly with the inexorable goal of getting me to where I will be tomorrow.  As you can see, I’m an epistemological conservative, but still I can’t help but believe: all that effort that whoever/whatever puts into seeing me to my assigned place ... that’s a lot of pointless expended energy, if there really is no purpose behind it.

Something to think about, anyway.

Sunday, October 2, 2011

Proscription Drugs


I believe that we, as human beings, like to simplify things.

The truth is, we live in a complex world.  The laws of physics that we know about are far beyond what most of us can comprehend, and most physicists agree that we don’t know all of them yet.  The intricacies of the human body are no less baffling to all but the most learned biochemists and neurologists and geneticists, and, there again, there are still mysteries which confound even them.  History is full of factual ambiguity; philosophy is full of moral ambiguity; literature is full of contextual ambiguity ... is it any wonder that we need to find a way to reduce things, simply to cope with living in the universe we find ourselves in?

Of course, the danger when simplifying is that we may oversimplify.  I’ve discussed before how we “know” that there is no black and white in the world, and yet stubbornly persist in perceiving most things in absolute terms such as “true” and “false.”  (In fact, you might even go so far as to say our view of balance is itself a paradox.  But that’s straying too far afield from my point.)  Let’s take a field at random ... oh, let’s say ... English grammar.

How many of you out there know that it is wrong to split an infinitive?  Go on, raise your hands proudly and be counted.  You know the rules of grammar, right?  You were taught this stuff in school.  Splitting infinitives is just one of those things which is downright wrong.

Of course, “right” and “wrong” would be just like “black” and “white” ... right?  And we know there’s no black and white in the world ... right?

Now let me ask you this: for those of you who didn’t raise your hand about the split infinitive being wrong, why not?  Did you trot out that chestnut about the English language constantly evolving?  Don’t get me wrong, that’s true, but what it implies is that splitting infinitives used to be wrong, but now it’s okay.  And I’m not sure I agree with that.

Wikipedia, of course, is pleased to present us with a history of the issue, and the executive precis is that not only is there no rule against splitting infinitives today, there never has been.  Some folks came along and said they didn’t like it, and gave some great examples of instances where it really is quite awful to do.  But somehow we took “here’s a technique which is often abused and needs to be carefully examined” and turned it into “never do this!”  We oversimplified.

What brought this to my mind today was reading an online post from someone (whom I greatly respect) who dismissed a suggested wording change because it used the passive voice.  And we all know that passive voice is wrong, don’t we?  After all, Microsoft Word marks it as a grammar error, so it must be wrong.  Except it’s not.  Passive voice isn’t wrong.  It can be used very poorly, I’ll grant you that ... but isn’t that true of practically any grammatical construction?

This one in particular dates to the classic Strunk & White.  They gave us all sorts of great advice on how to write more clearly.  Except that most of it was pretty bad advice, unfortunately.  And, if you’re not the sort of person who’s so inclined to click on perfectly good links that I drop into my blog posts, let me quote you the most important sentence of the article, at least as regards the proscription on passive voice: “Of the four pairs of examples offered to show readers what to avoid and how to correct it, a staggering three out of the four are mistaken diagnoses.”  That’s right folks: in the section of Strunk & White that tells you why you shouldn’t be using the passive voice, only 25% of their “bad examples” are even passive themselves.  And this is a book that many people regard as definitive, in terms of grammatical correctness!

But, regardless of the correctness of the examples, the point is that even Strunk & White don’t say “passive voice is wrong.”  They say “it should be avoided, wherever possible.”  If you want my opinion, even that’s too strong a statement, but let’s overlook that for now.  How did we get to the point where, in a discussion about what the best wording for something might be, the very thought of using a passive voice construction is dismissed with such casual prejudice?  Not even worthy of consideration?

In another discussion (same web site, different interlocutor, far less respect), someone chastised me for ending a sentence with a preposition.  I cheerfully responded with the quote, commonly attributed to Winston Churchill (although most likely apocryphally), that that was “nonsense up with which I would not put.”  The response, given in some distress, was that Churchill was known to suffer from “mental illness” (which is utterly irrelevant, of course, whether true or not), followed by a plea to “save the language.”

Seriously?

Ending a sentence with a preposition is not only incontrovertibly wrong, but so utterly wrong as to spell the doom of the English language as a whole?

No, unfortunately, it’s not even wrong at all.  This “rule” stems from a fellow named Robert Lowth, author of A Short Introduction to English Grammar, and, once again, even he doesn’t say “never do it.”  He says, in fact: “This is an Idiom which our language is strongly inclined to; it prevails in common conversation, and suits very well with the familiar style in writing; but the placing of the Preposition before the Relative is more graceful, as well as more perspicuous; and agrees much better with the solemn and elevated Style.”  See?  Not “wrong.”  Just “sounds better the other way.”  In his opinion.  As a clergyman.  Who wrote “an Idiom which our language is strongly inclined to” in a sentence about not ending things with prepositions.

I could even point you to several other lists of mythical grammatical rules such as these, as well as many others (don’t start a sentence with a conjunction, never use double negatives, etc.), but the point is that, even when the proscription doesn’t reach the level of “rule,” we still can’t resist stating it as an absolute.

Let’s take the case of adverbs.  Mark Twain says “I am dead to adverbs ... they mean absolutely nothing to me.”  Graham Greene called them “beastly” and said they were “far more damaging to a writer than an adjective.”  Elmore Leonard has started a “War on Adverbs” and says “to use an adverb this way (or almost any way) is a mortal sin”; Stephen King apparently concurs when he notes that “the road to hell is paved with adverbs” and that by the time “you see them for the weeds they really are” it’s too late.  Because of these types of opinions, any number of web sites will tell you that you should never use adverbs or that you should ruthlessly expunge all of them from your prose.

Of course “never” is an adverb, as is “ruthlessly.”

For that matter, all four of the authors I quoted above, railing against adverbs, use adverbs themselves ... in fact, there are adverbs in all four quotes.  As with the proscription against the passive voice, the first problem with advising people to get rid of all their adverbs is that most people can’t identify them.  “Very” is an adverb, as is “always,” or “far,” or “sometimes,” or even “not.”  Imagine trying to write a piece of prose of any appreciable length without using the word “not.”  No doubt you could do it, as an exercise, but it would be painful, and your piece would most likely sound tortured in at least a couple of places.

Getting rid of all adverbs is such a patently ridiculous idea that some of the smarter know-it-alls have scaled back their advice.  “Not all the adverbs,” they hasten to clarify.  “Just the -ly ones.”  So, you know, just get rid of all those “-ly” words.  Like, you know: friendly, silly, lovely, beastly, deathly.  Those sorts.

Except those are all adjectives.

Yes, that’s right: when J.K. Rowling was criticized for an overuse of adverbs, for the sin of putting one right there in the title of her final Harry Potter book, it was a bit of an embarrassment to realize that “Deathly” was actually an adjective, modifying the noun “Hallows.”  At least I hope that author had the good grace to be embarrassed over the faux pas.
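
Just for the fun of it, here’s what “get rid of all those -ly words” looks like if you take it literally.  A deliberately naive little Python sketch of my own (nothing any of the authors quoted above actually recommend) that expunges every word ending in -ly:

import re

# "Just get rid of all those -ly words," taken literally, as a regex.
# Deliberately naive, which is exactly the problem.
LY_WORD = re.compile(r"\b\w+ly\b", re.IGNORECASE)

def expunge_ly(text):
    """Strip every word ending in -ly, then tidy up the leftover spaces."""
    return re.sub(r"\s{2,}", " ", LY_WORD.sub("", text)).strip()

print(expunge_ly("Harry Potter and the Deathly Hallows"))
# Harry Potter and the Hallows  (there goes an adjective)
print(expunge_ly("Only a friendly reply arrived."))
# a arrived.  (there go an adverb, an adjective, and a noun)

The filter can’t tell an adverb from an adjective, or even from a noun like “reply” ... which is rather the point: any rule simple enough to implement as a regex is too simple to be good writing advice.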

In some cases the advice gets watered down to the point where people tell you to get rid of all your adverbs that end in -ly unless they make the sentence better.  But, at that point, the advice has little to do with adverbs, and should instead apply to every word in your prose.

Personally, I love adverbs.  Sure, overuse of them is bad.  Overuse of anything is bad: that’s built into the definition of “overuse.”  Blanket statements about expunging them (ruthlessly or not) are just moronic (even if they do come from one of my most treasured literary idols).

But, as always, it is our human nature to want to simplify the “rule” to make it easier to remember.  What’s simpler? “don’t overuse adverbs, or use them in cases where a stronger verb would serve the purpose equally well, or use them redundantly, or attach them too often to ‘he said’ tags”? or “don’t use adverbs”?  What’s easier to teach: “don’t split an infinitive when the number or quality of the words between the ‘to’ and the verb cause the infinitive itself to be weakened,” or “never split an infinitive”?  What’s the cleaner aphorism: “don’t use the passive voice when the agent is known and the active voice is stronger, unless you specifically want to de-emphasize the agent, but not merely as a means to avoid responsibility for the agent or to pretend that there is no agent at all” or “don’t use passive voice because MS Word underlines it in green”?

And so we take a complex but useful piece of advice and turn it into something simple and profoundly useless.  We take a reasoned approach that glories in balance (and occasionally even paradox) and make it black and white: do this, don’t do that.

It makes it that much easier to correct other people with all our mistaken impressions.