
NoGIS

I’ve got a brother who lives in Connecticut, not far from New York.  I visited him not too long after September 11th, 2001, for no particular reason.  While I was there, a 9/11 benefit concert was held in New York, and we watched it live on television.  We watched a variety of performers come and go, as well as the audience’s varying reactions to them.  Toward the end of the concert, The Who (one of my favorite bands) got up to play.  They played Won’t Get Fooled Again and Baba O’Riley.

And the audience went nuts.  They yelled and screamed and punched the air and waved their flags and laughed and cried.  They cheered themselves hoarse for a band they believed understood their pent-up national pain and anger.  They cheered for their love of country and their faith in the future.  They cheered for America the Beautiful and for four British boys who seemed to understand.

I sat in my brother’s armchair, drinking a beer and watching this spectacle in dumbfounded horror.  Halfway through the second song, I jumped up and shouted at the television:

“Aren’t you people listening to the words?!?!”

I’ve been reminded of this fairly often as of late, most every time I encounter a discussion about the NoGIS ‘movement’.  For those of you who are unfamiliar with the catchphrase, NoGIS is a term adopted by many Map Dorks to signify a perceived need for a paradigm shift within the discipline.

As a concept, NoGIS is meaningful and interesting, and its more sober and informed proponents have supplied me with some lively and enjoyable arguments and/or discussions on the subject.  Of course, these are the same people who currently tend to shy away from the term ‘NoGIS’ as being inappropriate and ill-conceived.  The problem is that the term was adopted while the concept itself was still rather nebulous and unformed.  NoGIS was chosen as a nod toward the NoSQL movement, mainly – I think – because it sounded cool.

Anyway, NoGIS reminds me of that 9/11 benefit concert because for every sober and informed proponent of the concept, there are at least a dozen idiots who have no idea what the whole thing is about but have nonetheless jumped on the bandwagon because they couldn’t pass up an opportunity to wave their flag and shout.  People who are afraid that there’s a revolution brewing and are terrified that it might pass them by.  Kind of sad, actually.

Truth is, there’s no revolution.  Nor is there a looming paradigm shift.  What is occurring is a sort of branching of the discipline.  A fork in the road, as it were.  In fact, we arrived at that fork and passed by it some time ago, but only now has the need emerged to sit down and really figure out what it means.

Today’s GIS seems to have such different demands that it’s easy to jump to the conclusion that the entire discipline is due for a shake-up.  And it’s not just a question of size – while shuffling around terabytes certainly poses challenges, they’re not terribly different from those presented by shuffling around gigabytes not too long ago.  All other size-related issues fall into a similar category.  While the demands get bigger and bigger, so do our capabilities.

We’re talking about other sorts of change here.  Changes in the primary purpose our data is serving.  Who is using it, how are they using it, and for what purpose?  This is the fork in the road I’m talking about.

A meaningful split occurred at that fork (this is not to imply that there is any sort of divide in the discipline.  We’re all on the same side here).  A large part of the discipline continued happily down the road GIS has been travelling along since its birth, which is why any paradigm shift that happened was not a universal one.

But the new road called for a major reorganization of worldview.  On this new road, the client became the consumer.  The project became the product.  The science of GIS became the business of GIS.

What I’m talking about here is the commoditization of geography.

Yes – it entails a different tool kit, but not a dissimilar one (we are not alone in this – any discipline that has both a theoretical and an applied branch has these sorts of differences.  This is most easily seen by comparing how a discipline is practiced in the academy with how it is practiced in the public sphere).  And many of the tools do much the same job, but in a different way or to a different degree (a hammer and a pneumatic nailgun both drive nails).

What the flag wavers and shouters don’t seem to be noticing here, though, is that everybody wins.  This fork in the road is a very good thing for GIS.  The more directions we have research and development travelling in, the better off we all are.

As long as we all keep talking to each other.  GIS will continue to travel down both roads (and I hope there will be more to come), and the best thing for our discipline and ourselves is to share our advancements so that we can build upon and refine each other’s work.

If we must make distinctions, though, let’s at least do so in a manner that makes sense.  We could apply any number of labels we desire, and many of them would make as much sense as the others.  Personally, I like Theoretical GIS and Applied GIS (I’d like to think which is which is obvious).  They’re fairly descriptive and neither one has any particular negative connotations.

I think it’s about time we drop this NoGIS crap, though.  At the end of the day, we’re all just trying to apply some meaning to geography, or to extract some meaning from it.

And that, my friends, is GIS.

For Dummies

I am a pretty smart guy.  The tests that are usually used to measure these things tend to place me somewhere in the smartest 5% of humanity, depending on the particular test and what kind of day I’m having.  I am also smart enough to know the flaws inherent in these tests and am very much aware that they are not always accurate (unless, of course, you just want to run comparisons of middle-class, white guys of European descent).

So let’s allow for the less-than-perfect nature of intelligence testing.  Let’s say I’m considerably less intelligent than the tests are wont to place me.  For the sake of argument, let’s say that I actually place just inside the smartest 20% of humanity.

This means that every time I initially encounter another human being, there is an 80% chance that they will be dumber than I am.  Although even the most determined moron isn’t stupid all the time, I think if we took the time to crunch all the numbers (and allowed for the relative nature of stupidity), we would end up with something like a solid 20-25% chance that any time another human being opens their mouth in my presence, something stupid will come out of it.
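
For the numerically inclined, here’s the back-of-the-envelope version of that arithmetic as a quick sketch.  The 80% follows from the assumed placement just inside the top 20%; the conditional rate is a pure assumption, picked to land in the quoted range:

```python
# Back-of-the-envelope check of the claim above.
p_dumber = 0.80  # chance a random person places below my (assumed) percentile
p_stupid_given_dumber = 0.28  # assumed rate at which such a person says something stupid

p_stupid_remark = p_dumber * p_stupid_given_dumber
print(f"{p_stupid_remark:.0%}")  # ~22%, squarely inside the 20-25% window
```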

By now you may be thinking that I am arrogant.  While I feel arrogance is too strong a term, I am the first to admit I possess an ego the size of Louisiana.  However, my ego has nothing to do with my intellect.  Rather, it is a result of my upbringing.  My family took pains to see that I developed a strong self-image.  They did not foresee the monster they would create.

My intelligence, to the contrary, tends more often to have a humbling effect.

The smartest human being I have ever met (and believe me, children – she’s really fucking smart) once explained it to me this way:  The universe is an enormous place full of stuff we don’t know.  Somewhere in that immensity, we live inside miniscule bubbles made up of our knowledge.  When we learn new things, the size of our bubble expands, but the net result of this is that the surface area of our bubble (the interface where our knowledge meets our ignorance) increases.  Therefore, expanding our knowledge exponentially increases our awareness of just how much we don’t know.
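
Her geometry actually checks out.  A minimal sketch, assuming the bubble is a sphere of radius r:

$$A = 4\pi r^{2}, \qquad V = \tfrac{4}{3}\pi r^{3} \quad\Longrightarrow\quad A = (36\pi)^{1/3}\,V^{2/3}$$

The surface area grows with the two-thirds power of the volume, so every gain in knowledge necessarily lengthens the border with ignorance (strictly speaking the growth is polynomial rather than exponential, but the direction of her point holds).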

This is why those who possess truly superior intellects are usually not prideful about it.  Real intelligence instills humility.  Real intelligence knows that it has arrived where it is through a certain amount of luck and is thankful for it.  And real intelligence knows what it is – it needs no validation.  This is why most people who are truly intelligent view their intelligence as just another physical attribute, like being tall or having blue eyes.

And then there are those who just think they’re smart.  Those who are, in fact, not smart at all, but who believe they are because some test or web site or TV show told them so.  To be fair, they probably clock in at the smarter end of mediocrity, but they don’t actually ever cross over into the realm of intelligence.  And stupidity that thinks it’s smart is the most dangerous form of stupidity.

You know the type – there’s no humility in this crowd.  They’re oblivious to the vastness of their ignorance, mainly because they never look up from the shiny baubles of their amassed ‘knowledge’.  They actually believe that they ‘know’ things.  They speak of ‘truth’ and ‘fact’ that is ‘proven’ and ‘undeniable’ as if such things actually exist.  And what really drives them crazy is when someone has the gall to question their so-called ‘knowledge’.  This is when they leap to the attack, and their attack always takes the same form:  they must prove you wrong.  This is the only manner in which they can believe themselves to be right.  The fastest – hell, the only – route to intellectual superiority lies in the ability to point to another human being and convincingly declare: You are wrong! It’s kind of sad, actually.

But here’s the thing that pisses off the genius wannabes the most:  it is unacceptable in our society to walk up to others and say “I’m really smart”.  I mean, what’s the point of possessing a superior intellect if nobody notices?  How will everyone else know they are inferior unless their betters point it out to them?

So the wannabes found themselves in a bit of a pickle.  How can they show off their intellectual superiority without just coming out and saying it?

After applying their mediocre intellects to the matter, they eventually decided that the way to show off their brains was to be annoying.  You know – needlessly correcting grammar.  Obsessing over minute, meaningless detail.  Memorizing acronyms and insisting on using the complete term instead.  You’ve been exposed to the behavior.  You’ve probably wanted to knee a groin over it.

Eventually, though, they managed to see through the fog of their mediocrity and noticed that all they were accomplishing was to piss everyone off.  While they may have been exhibiting their superiority, the inferior masses were clearly not ‘getting it’.  A new method was called for, and after much screaming and gnashing of teeth, one member of this ‘intelligentsia’ stood up and said “Um…what about this ‘God’ thing?”

After a brief fight, he managed to clear enough space around himself to offer an explanation:  “I meant that we should profess ourselves as atheists.  Everyone knows religion is for idiots.  If we say we don’t believe in God, everyone will know we’re smart.  And society allows us to go around saying we’re atheists.”

The rest – as they say – is history.  Now the creme de la mediocre have adopted atheism as their own personal religion.  And they cling to a few studies that support their primary idiocy, i.e., ‘smart people tend to be atheists’.

But what the mediocre minds really hate most is me.  I show up and declare my atheism in complete (usually well-constructed) sentences, and at first they welcome me with open arms.

And then I go and ruin everything by explaining a few things to them.  Like evolution is a belief, not a fact.  Like unbelief is as much a matter of faith as belief.  Like atheism is, in fact, a form of religion, as is science.  And my personal favorite, the one they hate most:

The universe is a really big place.  There’s enough room in it for more than one Truth.

Signature

My formal education and training is in archaeology and history.  Archaeology was my first career-related love; history was more or less a by-product.  Here in the US, most archaeology is actually a sub-discipline of anthropology (but not all.  A discussion for another time), so while my field of study was archaeology, my degree is in anthropology.  Why, you may ask, is archaeology considered a sub-field of anthropology?  The reasons are complicated, but in a nutshell the answer is:  Because Franz Boas said so.

Anyway, that argument aside, modern American anthropology consists of four basic sub-fields:  archaeology, biological anthropology, cultural anthropology and linguistic anthropology.  There is some overlap and intermingling betwixt these sub-fields, but not as much as you would think.

Anthropology is, relatively speaking, a young science.  In the grand scheme of scientific endeavor, it really hasn’t been around that long (compared to – say – astronomy).  Because of this, it has undergone some well-documented changes in a relatively short period of time.  In its earliest manifestation, the field was basically cultural anthropology – a group of dedicated researchers (mostly men) who went into the world to spend some time amid strange peoples and learn their ways.  This often occurred amongst marginal and/or aboriginal peoples, mainly because – let’s face it – white people are boring.

Then Franz and his ilk came along, and they sliced anthropology up into its current major subdivisions (yes – I’m oversimplifying.  I’ve only got so much time here), due mainly to the fact that the field of anthropology was getting larger and more complicated.  So now the budding anthropologist needed to know more.  Now the discipline demanded more of them, namely what was then called four-field competency.  A couple of generations ago, this was something you could expect to find in any given anthropologist.  While they would have had their own particular area of specialization and expertise, you could reasonably expect them to be able to hold their own in any of the four sub-disciplines.

But time marched on, and anthropology again became larger and more complicated, so that by the time I got to it, anthropologists were no longer expected to possess four-field competency.  By this time, four-field exposure was considered to be adequate (this should not be taken as a comment on the quality of education in regard to anthropologists.  A person can only be expected to carry so much around in their head, and as the field expands, the requirements must narrow).  Today, I wouldn’t be at all surprised if four-field exposure is no longer considered necessary.  As time has gone by, the field of anthropology has become more and more specialized.  And this occurs even within the sub-fields.  American archaeology is immediately, broadly, divided into two categories:  Prehistoric and historic (the dividing line being drawn at the arrival of white people).  This gets even further subdivided, in ways I won’t go into here.

So a couple of generations ago, any given anthropologist could reasonably have been expected to be able to ‘do anthropology’.  Dropped into any situation in which the skills of an anthropologist (of any sub-field) were needed, it could safely be assumed that they would be able to perform as necessary.  A generation later, this was no longer the case.  And we get further away from it every day.  This is mainly because anthropology is (as previously stated) a young science.  It really hasn’t been around very long, and it’s barely out of its toddler stage.

This progression is not unique to anthropology.  All sciences grow from infancy into maturity (a state yet to be achieved for many sciences), the major difference being the length of the time period over which this occurs.  For some, it’s centuries.  For others, generations.  For others, considerably less.

Which brings us to GIS.  While you could (rather effectively) argue that the practice of GIS has been around for centuries, the discipline of GIS has only been around for a few decades (give or take).  In that time it has progressed through infancy and well into toddler-hood, possibly beyond.  The speed with which this occurred can be problematic.  As an anthropologist, I could safely stand on my own generational ground and look behind and before me.  I could see the ‘old days’, where four-field competency held sway and where anthropologists could be expected to ‘do anthropology’.  I could simultaneously look forward to where anthropologists would no longer really understand the interrelationship of the four sub-disciplines and where specialization would hold sway.  In the field of GIS, however, many of us have watched the development of our discipline happen right in front of us.

My first exposure to GIS was in the form of a class called ‘Computer Mapping’ (that’s right – while the term ‘GIS’ had been around for a short while, it hadn’t yet graduated into common usage).  For software, we used MapInfo (waaaaaaay before Pitney Bowes).  I was (as you may have guessed) studying archaeology at the time, and the usefulness of GIS to the discipline did not escape me.  The purpose of the class was to (eventually) produce a road atlas.  The end result for me personally was to seal my doom and condemn me to a lifetime of Map Dorkitude.  Toward the end of that class, I purchased my first copy of ArcView (for only $250.  At that point, at least, ESRI offered substantial discounts to students).  I spent the following Summer teaching myself how to use it (Map Dork!  Map Dork!).  By the end of that Summer, I think it’s safe to say that I was quite able to ‘do GIS’.  Because – let’s face it – at that time, a general proficiency with ESRIWare equated to an ability to ‘do GIS’.

But that was a long time ago (in GIS-time, at least.  Not so much in real-time.  I’m not that old), and the ability to narrow GIS to a particular skill set (or software vendor) is long past.  Sure – there’s a certain baseline skill set – a core of knowledge – that all practitioners of GIS possess and use, but the field has progressed so far beyond the baseline that the mere possession of the basic tool kit no longer enables or qualifies a person to ‘do GIS’.  As a matter of fact, the very idea of ‘doing GIS’ has almost become absurd.  We cannot assume that a speed skater and a football player are engaged in the same activity because they are both ‘doing sports’.  Neither can we say that a chemist and a geologist are engaged in the same activity because they are both ‘doing science’.  Therefore, I think it’s a little ridiculous to say that a person setting up a server stack and a person taking a waypoint on a mountaintop are engaged in the same activity because they are both ‘doing GIS’.  GIS – as a discipline – has progressed too far and grown too much and gotten too complicated to wrap into a single package that a single individual can ‘do’.  So the idea that one can be certified to ‘do’ GIS is either an extreme absurdity or an extreme conceit.  In either case, it’s a concept I refuse to buy into.

And this brings us to the first and primary reason I won’t have anything to do with GISCI and their GISP program.  I don’t believe that GIS can be effectively stuffed into a pigeonhole that would easily lend itself to certification.  In all fairness, though, I’m not all that sure GISCI claims to certify people to ‘do’ GIS.  From their home page:

“A GISP is a certified geographic information systems (GIS) Professional who has met the minimum standards for ethical conduct and professional practice as established by the GIS Certification Institute (GISCI)”

In other words, a GISP is someone who has been certified by GISCI to be – well – certified by GISCI.  I leave it to the individual to determine the value of this.  Now – it could be that somewhere within GISCI’s ‘minimum standards’ lies the ability to ‘do’ GIS (as they perceive it).  I can’t really say, because I haven’t been able to find a definition of GIS on GISCI’s website (I’ll be the first to admit that my search for one has not been exhaustive).  I did find this tidbit, though: “The GIS Certification Program is an opportunity to define the profession of GIS.”

So, to recap:  By paying GISCI, not only can we become certifiably certified, but our certification may someday help us to determine what it is that we are certifiably certified to do (although actually doing it may require another certification).  Hot damn!  Sign me up!

But let’s try to be charitable here.  Maybe GISCI is sincerely trying to respond to a need, however ineptly they may be doing so.  Does GIS – as a discipline – need some sort of certification or licensing to achieve legitimacy?  I believe this is a valid question, and I think the answer is “no”.  This question has been debated within the community for quite some time now, and the opinions seem to be pretty evenly divided.  If you think about it, this is one of those cases where a lack of consensus equates to a “no”.

Let’s take it a step further:  If not necessary, would such a certification or licensing process be desirable?  For much the same reasons, I think the answer to this would also have to be “no”.  In this case, though, I don’t think it’s so much that the community wouldn’t like to see something of the sort in place, but what they would like to see (in such a case) is something other than what GISCI has to offer.  And GISCI seems to be doggedly determined to stick to their program.  And they seem equally determined to convince the rest of us that we need what they’re selling.  Another reason I’m not interested.

Other than that, the only thing GISCI and their GISP program seem to be offering is a code of ethics.  Sorry, but I again feel the need to respectfully decline.  It’s not that I have anything against codes of ethics, per se, it’s just that I find them to be pretty much useless.  There is an old saying:  ‘Locks only stop honest people’.  In a similar vein, a code of ethics will only really be adhered to by people who don’t, in fact, need such a code in order to act ethically.  Those who are prone to act in an unethical fashion will certainly not be stopped by a code of ethics (especially when it really counts – when nobody’s looking).  A code of ethics is only useful when it has teeth.  GISCI’s code is only enforceable with those who live in fear of having their certified certification taken away.

So at the end of the day, GISCI simply isn’t offering anything I have a use for.  I am not saying that their program is without value, just that it holds no value for me personally.

Which brings us to the last item on my list, and the only one that I feel could actually be called a ‘complaint’.  While I have no use for GISCI and GISP, they do not, in fact, annoy me.  What does annoy me is their fanboys.  I’m not talking about their proponents and/or supporters, many of whom I have had lively, entertaining and informative conversations (sometimes even arguments) with.  I’m talking about the zealots.  Like the yahoo who told me that GISP is not about competency but about commitment to the profession.  In much the same way the phrase “No thank you.  I have my own belief system and would rather not read your literature.” somehow transforms into “No thank you.  I’d rather eat babies and burn in Hell.” on the trip from your mouth to the ears of the stranger who came knocking on your door, so does the statement “I don’t need a certification to validate what I do.” somehow become “Your support of said certification invalidates you and what you do.”  In some quarters, the support of GISP borders on the religious.

So allow me to make this as clear as I can:  My indifference to your certification does not – cannot – invalidate it.  My opinion of GISP does not determine its value or lack thereof.  If pursuing and achieving a certification is meaningful to you, then you should by all means do so.  But do not expect me to attach the same meaning to it.  I will make my own choices in the matter.

And, if it’s all the same to you, I think I’ll just express my commitment to the profession through a dedication to competence.

Signature

Shoe Phone

A short time ago, I read an interview with a science fiction writer (I forget who, regrettably) who was lamenting the fact that he was going to have to extensively re-write one of his stories.  The reason?  The story in question was about to be made into a film (or possibly a television program) but before this could happen the ‘technology’ in the story had to be brought up to date.  The author specifically made mention of the cell phones his characters carried.  It seemed the characters worked for some sort of government agency and therefore the phones they carried were some sort of uber-techno, bleeding-edge dream devices.

Or so the author believed at the time.  He thought he had dreamt up some pretty incredible, technologically advanced devices.  By the time the interview took place, though, the author lamented that more impressive phones could be purchased for less than $10 at Wal-Mart.  Technology had not only caught up with his imagination, it had surpassed it.

I bring this up because I have recently gotten a new cell phone.  My old one (a Motorola Razr that I loved dearly and that probably made me seem far cooler than I actually am) started to act up, so I checked with my carrier and discovered that I was due for a free upgrade.

I took a good, long, iLook around me, and came to the conclusion that touchscreen phones are the wave of the future.  There seems to be a certain – inevitability – about them.  As though (at least in the immediate future) phones are going to end up with touchscreens whether we like it or not.  This being the case, I decided that I should procure such a phone for myself, the idea being that I could thereby slowly indoctrinate myself into the Mysteries Of The Touchscreen.

Now – I didn’t want to get too crazy about this, mainly because I didn’t know much about them.  I thought an iPhone would be nice, but I didn’t know enough about either the technology or what I really wanted from the technology to justify spending the money.  So I decided I would simply order whatever touchscreen phone I could get for free.  Just to get my feet wet, as it were.  I assumed the phone would pretty much be crap, but that through ownership of it I could learn enough about the devices to become capable of making informed (hopefully intelligent) choices when I finally do take the plunge.  This same theory worked quite well when I got my first GPS.

Well, I have to report that I seem to have been correct on all points but one.  That one point would be my assumption that my new phone would be crap.  This is far from the truth.  The phone I ordered is a Samsung Solstice A887, and there is no doubt that it is the best phone I’ve ever owned.  I don’t think a day has gone by that I haven’t discovered a new feature.  It’s almost frightening.

Don’t get me wrong – there are a couple of things I would change about the phone.  I would give it the ability to connect to the internet through my existing home wireless network (as well as other networks – say at coffee shops or airports), for one.  For another, I would change the USB/power jack to a standardized one so I could connect and charge my phone without their cord (the connection issue was easily solved with Bluetooth, though).  So far, though, those are my only two gripes.  And they are far outweighed by the pluses.

The phone has a slot for a standard microSD card, up to 16 gigabytes (which is enormous for a cell phone).  It connects to a PC via Bluetooth (as mentioned previously) and can exchange files either through the computer’s interface (which – in Windows, at least – kind of sucks) or through a program Samsung offers as a free download, and which has an adequate (and user-friendly) interface.  The phone natively plays mp3s, through external speakers or through the built-in one (it’s oddly like having your own soundtrack.  I chase my son around to The Imperial March from Star Wars).  It also natively plays mp4 video, although you need to bring some mojo to bear to adequately tweak them (I’m currently carrying around a Harry Potter movie, ripped off the DVD with HandBrake.  It’s shrunk to 320 x 180, with 24 fps, but it looks surprisingly good for something in a phone.  And the file’s only a tad over 500 MB).  The phone can also read ebooks, although I haven’t yet put this option through its paces – so far, I’ve only put my own work in my phone (short reference crap, mostly), since it can read either Word docs or PDFs without doing anything fancy.  There are also a bunch of built-in apps, most of which I don’t really know what to do with.  Probably why iPhones scare me a little.  I think I’d be paralyzed by all the choices.
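
For anyone curious about the Harry Potter rip mentioned above, here’s a rough sketch of how something like it might be scripted using HandBrake’s command-line flavor.  The paths and bitrates are my own assumptions, and the flags are worth double-checking against HandBrakeCLI --help for your version:

```python
import subprocess

# Hypothetical paths; HandBrakeCLI must be installed and on the PATH.
SOURCE = "/path/to/dvd_rip"   # decrypted DVD folder or .iso
OUTPUT = "movie_320x180.mp4"

# Roughly reproduce the settings described above: 320 wide, 24 fps,
# and bitrates low enough to land a feature film near 500 MB.
subprocess.run([
    "HandBrakeCLI",
    "-i", SOURCE,
    "-o", OUTPUT,
    "-e", "x264",   # H.264 video in an mp4 container
    "-w", "320",    # width; height follows from the source aspect ratio
    "-r", "24",     # 24 frames per second
    "-b", "450",    # video bitrate in kbps (assumed, tuned for file size)
    "-B", "128",    # audio bitrate in kbps
], check=True)
```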

I also have the ability to shoot video, which I haven’t really explored yet.  The camera’s only 2 megapixels, which is small by today’s standards, but it’s the same resolution as the Sony Mavica I took with me when I studied at Oxford for a summer (I took almost a thousand pictures that summer and they still look good).  The camera also has its features, some silly, some cool, most unexplored by me.  I have played with the panoramas, though.  This was what my office looked like this afternoon:

Office

Not too bad for a free phone.

Which brings me to the whole point of all this.  This phone was free, which means that – by today’s standards – it’s really not considered to be anything special.  And yet I used it to take that image from the mountaintop I stood on today (feel free to click on the image to see it full-sized).  And the device I held in my hand to take that image (and that I carried up there in my pocket)  also held several CDs, a few books and one movie, all of which barely scratched the surface of the device’s storage capacity.  And we haven’t even mentioned the device’s ability to connect to the rest of the world from a spot remote enough to challenge a GPS (but not really that remote – I do live in New England, after all).  As well as all the rest of the phone’s functions – from calculators to calendars to memos to banking to GPS to radio to video games to all the other stuff I don’t even know about.

All of that, and it’s not even the crazy part.  The crazy part is that all of that isn’t even considered to be a whole lot by today’s standards.  It boggles the mind a bit.

I have to say, though – it’s pretty damn cool.

Signature

Pluto

Update:  The boy and I just watched Interplanet Janet, from Schoolhouse Rock.  She considers Pluto to be a planet.  Who are we to argue?

Okay – here’s the deal.  I’m getting a little tired of this Pluto’s-Not-A-Planet crap.  Why, I ask, would Pluto be considered to be anything other than a planet?  The answer goes:  ‘Because it doesn’t fit the definition of “planet”’.

Huh?  When did this happen?

2006.  That was when the International Astronomical Union (IAU) decided (for whatever reasons) to write a new definition of ‘planet’.  Their definition is as follows:

1)  Have an orbit around the Sun.

2)  Have enough mass to assume a (mostly) round shape.

3)  Have cleared the neighborhood in its orbit.

The third is the one Pluto falls short on, and for this reason they’re now referring to it as a ‘dwarf’ planet.  There are some in the profession fighting this (mostly because expecting Pluto to clear out its neighborhood is unreasonable, given the enormous size of its orbit), but so far they have been unsuccessful.  My take on this is that the IAU is going at this ass-backwards.
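
Just to make the logic of the IAU’s taxonomy explicit, here’s a toy sketch (the function and its inputs are mine; the third input is exactly the judgment call in dispute):

```python
# A toy classifier for the 2006 IAU taxonomy, as summarized above.
# (The full definition also requires that a dwarf planet not be a satellite.)
def classify(orbits_sun: bool, round_shape: bool, cleared_neighborhood: bool) -> str:
    if orbits_sun and round_shape and cleared_neighborhood:
        return "planet"
    if orbits_sun and round_shape:
        return "dwarf planet"
    return "small solar system body"

print(classify(True, True, False))  # Pluto fails only the third test -> 'dwarf planet'
```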

Years ago, the archaeological world had a list of criteria they used to define a ‘civilization’ (much like our planet-defining list above).  If I remember correctly (and I usually do), there were 5 items on the list, the pertinent one being possession of the wheel.  It was thought that a group of humans couldn’t reach the lofty heights of true civilization without first developing the wheel.  Then one guy, who had spent his life studying the Inca, raised his hand and said: “But – the Inca never developed the wheel”.  He was told that the Inca, having failed to measure up to the definition, couldn’t have been a ‘civilization’.  “But,” he argued, “they built Machu Picchu.  They had a trade network that spanned a continent.  They had suspension bridges, for Christ’s sake!”

“Hmmm,” said his colleagues, “Maybe we should re-think our definition.”

This, my friends, is how science is supposed to be done.  A good scientist does not look up on an overcast day and say:  “It’s not blue, therefore it cannot be the sky”.

Pluto has enough of a gravitational influence on our solar system that its presence was known decades before anyone actually ‘saw’ it.  It has three moons (that we know of), putting it ahead of Mercury, Venus, Earth and Mars.  Most importantly, in the decades during which Pluto’s existence was known but it had not yet been ‘seen’, it was known as “Planet X”.  This alone gives it more celestial street cred than all the other so-called planets combined.

It’s a friggin’ planet.  Just fix the definition, already.

Besides, we don’t really want to piss off the god of the underworld, do we?

Signature
