Progression of the information economy

posted in: Publishing

Once upon a time, the New York Times used to be a real newspaper.  These days it’s mostly a mouthpiece for the publishing industry, because the publishers are nearly all in New York and a lot of the writers at the Times want to jump over to the publishing houses.  But back in the day, they used to write real news and real articles.

This article is one of those, and a good one at that.

It talks about Microsoft’s Encarta product, which was a software encyclopedia.  See, back in the very early 90s, paper encyclopedias were still a thing.  For the kids out there, they were these huge sets of books; and not small books either.  Each one was usually both thick and big: about the size of a phone book, and maybe half as thick.  Oh, kids haven’t ever seen phone books either, I bet.  Hmmm … ah.  The volumes were about the size of a Rolling Stone magazine, or maybe 80% of a supermarket checkout lane tabloid.  Or, something even today’s digital-age kids have seen: about the size of a college ruled notebook, but thicker; usually about two inches thick.

These encyclopedias got their content from staffs of writers and fact gatherers, people who would run around (in some cases, especially before the early 80s, literally) or call around and assemble data for the writers, who would then churn out ‘readable’ articles that went into the encyclopedias alphabetically.  Most sets ran to twenty or thirty volumes, because when a set of books is trying to be the answer-for-everything reference set in a library, it needs a lot of room.  Stacked vertically on the floor, a set of encyclopedias was often taller than a 5th grader.

The companies that created these encyclopedias charged a lot for them.  Thousands of dollars.  Most of their sales were to libraries.  Now, keep in mind, this was before publishing contracts started getting really vicious, greedy, and unreasonable.  So even though it might cost a school or college or public library $1500 or $2000 or $3000 or more to get a single set of encyclopedias (and most had several sets from multiple publishers), the contracts weren’t written to demand that libraries rebuy new ones every so often.  And books, especially books that don’t leave the library, can last a long time.

Of course, being paper, the data within them couldn’t be updated.  The sets would have to be replaced.  Most libraries replaced them every decade or so, some less often than that, others more frequently.  And the encyclopedia publishers occasionally convinced a ‘normal’ person to buy one, which is why you might sometimes hear a comedian joke about ‘encyclopedia salesmen’; those used to be an actual thing.  But the libraries could make their own decisions about when it was time to update their encyclopedias; they didn’t have the publisher threatening to sue them for breach of contract when they decided last year’s (or last decade’s) set was still fine to keep on the shelf.

Then things started changing toward the information age.  First, the computer became something normal people could have; not just the military or big companies, or then smaller companies, or then only rich people.  As the 80s closed out, computer prices got cheaper and cheaper, and you could get one for about what a set of encyclopedias would run you.  Second, the Internet became a thing.  We all know how that worked out (pretty well).

Enter Bill Gates.  People often forget what old Bill was studying at Harvard, or assume it was technology or software or programming or something related to circuits and code.  He was actually enrolled as a pre-law student, though he spent most of his time on math and computer science.  And, as it has turned out, he was also one of those rare people who had a vision of the future.  Rarer still, he was a visionary who did something to make his dreams come true.  That dream was Microsoft, which played a more or less irreplaceable role in popularizing and accelerating the adoption of computers, digital life, and the whole information age we have today.

As Windows started to get established, Bill was casting about for products that would make people want to use Windows, and sometimes even products Microsoft could sell other than the operating system.  One of these ventures was a software encyclopedia.  The big challenge in building an encyclopedia is the data, the content.  The packaging and so forth is not complicated, but duplicating all of that information from scratch would be expensive.  Why was Bill bothering?  Because, as all of us in the information age have now learned and probably even forgotten (we take it so for granted): computers make everything better.

Remember, the data in an old-style set of encyclopedias is enormous.  Four or five feet (or more) in height when printed, bound, and stacked vertically.  Even alphabetized, it takes time to look things up.  And all the wonderful tricks that digital brings to the process, the ones that make research so much easier, are absent.  No search function.  No hyperlinks.  No easily available competing sources so you can compare and contrast.  No equally available reference lists for further research.  Nope, all you had was a single entry about “Aardvarks” or “Zebras” or whatever.

Bill wanted to change that, and he needed the content, so he went looking to license it.  And was shot down.  These old-style companies had their own ideas about how encyclopedias were handled, and they didn’t see the point of bringing computers into the process.  They relied on their sales forces, and didn’t believe people would ‘buy’ encyclopedias unless they were sold on them.  And by sold, we’re talking about the salesman getting in front of you and talking you into it.  Most of us know how that sort of transaction plays out; if you roll poorly on your save-vs-salesman, you end up with something you’re later scratching your head over, trying to figure out why you bought it.

So the first few encyclopedia publishers said no, no, no, and definitely no.  Until Bill got to Funk & Wagnalls, one of the smaller publishers.  They were a real encyclopedia publisher, but they didn’t have the massive investment in a titanic brand that the leaders in the sector did; so they went with the proposal and sold Microsoft a non-exclusive license for their data.  That license became Encarta, backed by the power of Microsoft.

Bill did make mistakes.  He tried to price the product lower than the paper version, but it was still around four hundred dollars.  And that was back when four hundred dollars was still quite a sum of money.  To put it in some sort of perspective, the computer plus monitor was usually at least $1200, and most sold for closer to (or over) $2000.  In the early 90s, $2000 could still get you a pretty nice used car in good shape if you did just a small amount of looking and shopping around.

Price elasticity theory came into the picture, and so did competition.  Microsoft was soon convinced to lower Encarta’s price to $99.  And, oddly enough, sales started to take off.  That was still a not-small amount of cash, but it was small enough that parents could fit it into a birthday or Christmas present, the kind that makes kids put on that fake smile we all use when we get socks or horrid sweaters.  “An encyclopedia, great, thanks mom.”  $99 was also cheap enough that libraries, which were starting to be expected to have publicly available computers at that point, could afford enough copies of Encarta to run on all those computers.

Digital started to invade.

Then, the wonderful world of Wikipedia came into the picture.  Newspapers and magazines did not know what to make of Wikipedia.  A lot of articles were written that ranged in tone from curious to dismissive.  Some were downright hostile.  “What is crowdsourcing?” some of them asked.  “How can you have an encyclopedia full of entries written by normal people instead of paid experts?”

We all know how Wikipedia has played out.  It’s become what is very probably the single greatest repository of information available to humanity.  Some people will certainly want to argue this point, and they’re welcome to have at it, but in terms of accessibility, relevance, speed of updates, and most other metrics that matter, it’s pretty hard to put together a resource that can beat it.  As it turns out, there are lots of people who update Wikipedia.  And lots more who cross-check and monitor all those updates, keeping a lid on the kinds of things some naysayers warned about.

And a lot of the updaters are experts in their fields.  I’ve even read articles about college professors assigning their graduate students certain numbers and kinds of updates as part of their courses.  For example, one article I read talked about a medical school professor who required his classes to review and correct, update, or expand entries on anatomy, diseases, illnesses, and other human-biology-centric topics.  Updates that not only the professor would check (and grade for the class), but that the usual Wikipedia volunteers would keep an eye on too.

What’s my point?  Well, first of all, it’s usually cool to see how we got from there to here; where there is some point in the past, and here is the cool stuff we have now.  If you’d told ordinary people in the late 70s that we’d all have effectively the sum total of human knowledge available in our pockets or on our desks, hardly any of them would have believed you.  Or paid attention.  Yet here we are, with exactly that.  The thing I always used to see in scifi as a kid, where the characters would ask the computer questions and it would immediately give extremely comprehensive answers, that’s what we’ve got.  The only part of that particular little chain that’s missing is the voice-activated interface that can understand conversational English.

And today, Watson is changing even that.

But past the cool factor, it also (sadly) serves to illustrate yet again how people and organizations purposefully obstruct progress.  Above I touched on how the encyclopedia industry got in the way of our moving to digital information access.  They were invested in paper, in printing and selling paper, and when software came along they saw a threat instead of an opportunity.  How much better would it have been (for everyone) if they’d embraced digital?  I don’t know if they could’ve beaten off Wikipedia, but if they’d transitioned with the information age instead of standing in place opposing it, they would at least have had some sort of a chance.

Watson is getting opposition too.  One of the key areas IBM has been focusing Watson on is medicine, because it’s a very complicated field with an immense amount of very technical, specific, and detailed information that every single medical decision relies upon.  Every single one; even the routine ones.  Doctors toss off a casual “that’s just the flu, drink lots of fluids and rest up” because centuries of medical research have shown that answer to be the best that can be offered when a patient presents with the flu.  But as anyone who watched House for the medical mysteries more than the character development knows, medicine gets very specific, and that wealth of information, combined with the even greater amount of stuff we still just don’t know, can be daunting for even the most dedicated medical professional to correctly parse.

IBM’s medical implementation of Watson has a full medical database available to it.  Not just the medical textbooks and encyclopedias, but research papers, study results, theories; a huge amount of medical information.  The way Watson works (and it works very well) is that you give it the patient’s history and symptoms, and it gives you back the diagnoses that could explain the patient’s issue(s), each with a probability figure attached.  The key medical element that Gregory House exploited on the TV show was that no one else went down the rabbit holes looking for the esoteric causes, the rare diseases and disorders and illnesses.  That made the show interesting, obviously; but in real life, when you’re the one suffering from some ridiculously unheard-of problem and years start going by without help, it really sucks.
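If you want to picture what that kind of ranked output looks like, here’s a toy sketch in Python.  Everything in it (the condition list, the scoring rule, the function name) is invented for illustration; it’s nothing like Watson’s real data or internals, which weigh evidence far more carefully than a simple overlap count.

```python
# A toy illustration of a ranked "differential diagnosis" lookup.  The
# knowledge base, scoring rule, and names here are hypothetical -- this is
# not IBM Watson's actual data or API.

KNOWLEDGE_BASE = {
    # condition: findings associated with it (wildly simplified)
    "influenza": {"fever", "cough", "fatigue", "body aches"},
    "coeliac disease": {"weight loss", "fatigue", "abdominal pain", "bloating"},
    "mononucleosis": {"fever", "fatigue", "sore throat", "swollen lymph nodes"},
}

def rank_diagnoses(symptoms):
    """Return candidate conditions sorted by a naive overlap score."""
    reported = set(symptoms)
    results = []
    for condition, findings in KNOWLEDGE_BASE.items():
        overlap = reported & findings
        if overlap:
            # Score = fraction of the condition's known findings that the
            # patient actually reports.
            results.append((condition, len(overlap) / len(findings)))
    return sorted(results, key=lambda pair: pair[1], reverse=True)

if __name__ == "__main__":
    for condition, score in rank_diagnoses(["fever", "fatigue", "cough"]):
        print(f"{condition}: {score:.0%}")
```

Feed it “fever, fatigue, cough” and influenza comes out on top; the point is simply that the machine hands you the whole ranked list, including the long-shot candidates a busy human would never think to chase.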

I’ve read all sorts of articles over the years about patients, some of whom went decades undiagnosed or misdiagnosed.  I remember one where the patient had a very specific form of coeliac disease, which is why she spent twelve years struggling desperately to keep her weight (as a full-grown woman) above 75 pounds: she literally couldn’t eat anything.  And doctor after doctor kept assuming she had some sort of mental condition.  It wasn’t until something like her 25th or 30th doctor that she found one who actually sat down, talked to her and listened, read her history, and then consulted the medical literature to properly research her condition and arrive at what turned out to be the correct diagnosis.  I’ve read others where the patient had a similarly rare kind of cancer, or a blood disorder, or an almost unheard-of genetic anomaly.  The pilot episode of Limitless last week had a moment where the main character (who’s a super genius most of the time) is able to diagnose his father’s chronic illness as an almost unheard-of condition that is easy to check for if someone goes looking, but basically never gets checked because it’s so rare.

We go to doctors because they’re supposed to be experts.  But they’re also human.  It would be nice if every doctor were like House, a genius with an eidetic memory, but that’s not realistic.  Let’s get back to Watson.  Watson might not qualify as a genius in all the ways people usually employ the term, but the medical version of Watson does have eidetic recall, and it never misses a reference.  It will give you that complete list of possible causes for what’s ailing a patient.  It will even list tests and procedures that can be used to gather information it thinks is missing from the picture; and, of course, it will list all the treatments for each and every condition it suggests as a cause.  Remember how House’s team would start running tests to track things down in some episodes?  Watson knows which tests will, and which won’t, shed more light on the question in front of it.

And it does it at the speed of a computer.  None of this scifi dramatic tension where you input the question and the computer says “okay, come back in a plot-determined amount of time later”.  Watson answers basically in seconds.

Would it surprise you to hear more than a few doctors hate Watson?  Because a lot of them do.  They look at Watson as ‘replacing’ them.  Or as ‘wasting their time.’  “I’ve been a doctor for twenty-nine years; I know flu when I see it, and I know how to treat a concussion or a broken leg or a burn victim.”  I’m sure they’re all good doctors, some of them even great, and hopefully only a very few of them bad ones; but they’re all human.  Humans make mistakes.  It’s part of what being human means.  Humans fuck shit up.  It’s part of our wonderful curse.  No one is perfect.  No one.

Insurance companies have started dictating that Watson, or systems like it, be used for things like prescriptions.  Anyone who takes medications regularly probably knows what drug interaction means; it’s exactly what it sounds like.  Not all drugs can be taken at the same time.  Some of them interfere with each other.  Some of them make others less effective.  Some of them, when taken together, are even dangerous.  Or lethal.  One quick example would be medications that lower blood pressure.  Taking multiple drugs with that effect can lower your blood pressure to the point where you collapse, and that can be fatal if not corrected in time.

Also, people have allergies.  Some drugs trigger allergies, and some allergic reactions are fatal.  And sometimes people have had bad reactions, or unexpected results, from drugs they were prescribed in the past.

Doctors are supposed to review a patient’s medical history and weigh all of these things with their carefully trained doctor’s minds before prescribing medications.  But mistakes happen.  Doctors are human.  They miss things, or they’re too busy, or they forget.  Insurance companies, who are often on the hook for wrongful death lawsuits, have started demanding that computers be brought into the process to ensure these kinds of mistakes don’t happen.  So that bad reactions and dangerous interactions and improper dosage instructions don’t “slip through” the system.  Because not all mistakes are “oops, sorry, my bad” mistakes.  Some mistakes kill.
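To make that concrete, here’s a rough sketch in Python of the sort of automated safety check those systems perform.  The interaction table, the drug names, and the function are all invented for illustration; this is not real clinical data or any vendor’s actual software.

```python
# A toy prescription-safety check.  The interaction table, allergy list, and
# drug names are made-up examples, not real clinical guidance.

# Pairs of drugs flagged (in this invented table) as interacting badly.
INTERACTIONS = {
    frozenset({"lisinopril", "amlodipine"}): "additive blood-pressure lowering",
    frozenset({"warfarin", "ibuprofen"}): "increased bleeding risk",
}

def check_prescription(new_drug, current_drugs, allergies, past_reactions):
    """Return warnings for adding new_drug to a patient's regimen."""
    warnings = []
    for drug in current_drugs:
        reason = INTERACTIONS.get(frozenset({new_drug, drug}))
        if reason:
            warnings.append(f"Interaction with {drug}: {reason}")
    if new_drug in allergies:
        warnings.append(f"Patient is allergic to {new_drug}")
    if new_drug in past_reactions:
        warnings.append(f"Prior bad reaction recorded for {new_drug}")
    return warnings

if __name__ == "__main__":
    print(check_prescription(
        new_drug="amlodipine",
        current_drugs=["lisinopril"],
        allergies={"penicillin"},
        past_reactions=set(),
    ))
```

The computer never gets tired, never gets rushed, and never forgets to look at the table; that’s the whole argument the insurers are making.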

Yet we still have medical professionals and medical organizations resisting this march of progress.  They don’t want it.  They don’t understand it.  They feel they know better.  They don’t want to change.  They don’t want to have to check the computer.  They just don’t care.  Just like the encyclopedia companies didn’t care about the change that was coming.  Or the movie studios with VCRs and then streaming.  Or the music labels with MP3s and streaming.

Or the kinds of publishers that work in my industry.

It’s tiring how familiar this all sounds.  When will people learn?