Posted by & filed under Opinions, Tech.

Google has announced the end of Google Reader. The tech media breathes a collective “meh,” but they are missing the real tragedy here. There are all sorts of ways to read RSS. Some people read every article in a feed, while others watch it flow by in a river of news. The one thing that is constant in everyone’s consumption of RSS-based information is that they very likely do it from more than one place, more than one application, and probably more than one mobile or desktop O/S.

Here is the incredibly powerful thing that Google Reader provides that will leave a huge, gaping hole in my daily RSS reading:

Synchronization.

Google Reader was at best an average RSS reader. But it excelled at keeping all of my other 3rd party RSS reader apps in sync. By providing a set of APIs that allowed remote readers to mark/unmark individual articles as read, it let me start reading news on my phone with Feeddler, continue on my desktop with Google Reader, and switch to Flipboard on my tablet later in the day without having to wade through the same news articles twice. What was marked as read on my phone never showed up as unread on my tablet. It also gave me centralized management of all my RSS feeds. When I nuked an entire feed on my desktop computer, it disappeared from my mobile devices.
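
Just to make the loss concrete, here is a minimal sketch (TypeScript, talking to a completely made-up sync service at sync.example.com) of the two calls every reader app needs to stay in step with every other client: mark an item read, and ask what is still unread. The endpoints, payloads, and token handling are my own invention for illustration, not Google's actual Reader API.

    // Hypothetical read-state sync API. Host, paths, and payloads are
    // assumptions for illustration, NOT Google's actual Reader API.
    const SYNC_BASE = "https://sync.example.com/api";

    // Mark an article as read so every other client sees the same state.
    async function markRead(token: string, feedUrl: string, itemId: string): Promise<void> {
      const res = await fetch(`${SYNC_BASE}/items/${encodeURIComponent(itemId)}/read`, {
        method: "POST",
        headers: { Authorization: `Bearer ${token}`, "Content-Type": "application/json" },
        body: JSON.stringify({ feed: feedUrl }),
      });
      if (!res.ok) throw new Error(`sync failed: ${res.status}`);
    }

    // Ask the server which items are still unread, regardless of which
    // device or app last touched them.
    async function unreadItems(token: string, feedUrl: string): Promise<string[]> {
      const res = await fetch(`${SYNC_BASE}/feeds/${encodeURIComponent(feedUrl)}/unread`, {
        headers: { Authorization: `Bearer ${token}` },
      });
      if (!res.ok) throw new Error(`sync failed: ${res.status}`);
      return (await res.json()) as string[];
    }

Every reader app calling something like this against one shared service is the whole trick. Without a common back end, each app keeps its own read state and the duplication starts all over again.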

When Google Reader shuts down, what is going to do that for me? The answer right now is “nothing.” RSS just became not only less valuable to me, but an actual impediment to how I consume information. People are saying that shutting down Google Reader is no big deal and it doesn’t mark the “death of RSS.” But losing this synchronization feature, an invisible but vital part of the RSS 3rd party app infrastructure, means that RSS reverts to something significantly less than it is now and removes its singular advantage for me over just serially visiting web sites in a bookmark list.

I hope someone is paying attention to the loss of this not-so-visible piece of Internet infrastructure and steps in to fill the void. Otherwise I’m gonna be cranky and have to write something myself.

Posted by & filed under News, Opinions, Tech.

With last week’s release of OS X 10.8, Mountain Lion, Mac users have been given a lot to digest. True to Apple form, there is a lot of marketing info about the release but not much in the way of specific changes or documentation for power users. Usually not a problem as the Internet does a good job of ferreting out the significant items. But there is one change that slipped under the radar that should have every Web developer who uses a Mac up in arms.

In OS X 10.8, Apple has eliminated Web Sharing. Yep, that’s right. The Sharing System Pref no longer has a way to start or stop your Mac’s built-in Apache Web server. If you are a command line guru, you’ll already know how to do this or perhaps this. But most people who use their Macs to develop web pages, test software that works with web servers, or just host a small site for friends, family, or co-workers are out of luck.

What’s worse, Apple clearly didn’t think this through because they really hose people who are upgrading from 10.7 or earlier. Whatever you had Web Sharing set to do under your older O/S version is preserved and that’s what it will do when 10.8 starts up. That means if Web Sharing was enabled under 10.7, it’ll still be running under 10.8. But you will have absolutely no GUI to turn it off, or start it up again if you do. Derp. This little nugget of dumbassery prompted me to file a bug report with Apple (#11982135).

As the long-ago author of the MacHTTP and WebSTAR web servers, I’m quite thrilled that Apple has opened up the market for third party web servers again with this bone-headed move. But I am sure it wasn’t their intent. It seems to be the opposite, actually, where someone in Apple marketing thought that disabling the Web server on the consumer O/S would somehow drive OS X Server sales. In any case, it’s time to dust off the old MacHTTP source code!

In all seriousness, however, one Mac developer has already solved the problem with a replacement preferences pane. So until Apple figures out what a bad mistake they’ve made, this is a reasonable work-around.

I guess the real question is, how can the hundreds of thousands of people who use their Macs as web servers let Apple know they screwed this one up? Not everyone can file bug reports. Suggestions?

Posted by & filed under Opinions, Tech.

This is an essay I wrote in June of 2002. It was written as a response to a blog post that Dave Winer made, questioning why software professionals don’t get the respect they deserve. I’m posting it again here, because the only other copy on the Internet is hosted on a site that isn’t mine. It’s a long article, but I think it takes a good stab at answering the question.

Doctors vs. Geeks 

I spent 3 years, from 1991 to 1994, working in the Texas Medical Center. It’s the largest medical institution in the world and at the time, it had the largest institutional information network in the world, too. It was an interesting confluence of geeks and doctors and an interesting time, right at the birth of the commercial Internet. I occupied a relatively senior position in the geek hierarchy and worked very closely on a daily basis with department chairmen, CIOs, CEOs, and world-renowned physicians from the various medical schools, teaching hospitals, universities, and clinics affiliated with the TMC.

I had been there little more than a month before I recognized something that I found more than a little disconcerting. Most of the medical professionals, and quite a few of the PhDs as well, held technology, specifically information technology, in high disdain. Oh, they liked things like video conferencing (for remote telemedicine applications), miniaturized electronics (for things like pacemakers, neural stimulators, etc.), and were always scheming up new gadgets to make a particular surgery simpler or safer. But “mundane” technologies like cell phones, PDAs, pagers, the Internet, the Web, e-mail, and other sorts of information-related devices held little interest for them.

After a while, it became clear why. These professionals viewed their professions as the pinnacle of human learning, and the human body as the most complicated system possible. Yet, I watched them struggle to solve problems on a daily basis that someone with rigorous training in logic-based problem solving could knock out in no time. Visualizing and producing solutions that led to better medical treatments, more effective experiments, more powerful drugs, or some other “system-related” result always seemed to be a matter of trial and error. Information systems were never applied to their best effect and the outcomes always seemed to be more the result of a random walk than a purposeful journey.

Oh, it wasn’t because the geeks didn’t try. Hundreds of us worked on things like better patient record systems, improved image analysis for pathology samples, simulations of various human organ systems, huge expert systems for patient diagnosis, better communications infrastructures for distance learning, telemedicine, and even remote surgery. The problem is that none of the geeks’ efforts stemmed from the worlds of physiology, endocrinology, nephrology, neurology, gastroenterology, cardiology, or any of the other dozens of -ologies that medical school students are inundated with. We were technology generalists, and the high-level view of information flow, system interactions, and design insight necessary to build a complex information system was completely at odds with the medical culture.

In this universe of academic medicine, specific knowledge ruled. Reductionists were czars. Questions like “how does this protein interact with that drug compound?” or “how do sodium channels in neurons affect neural capacitance?” always took precedence over more generalized questions like “is there a correlation between the bacteria load in a patient and the incidence of heart disease?” I watched doctors and PhDs who were experts in human physiology, who had studied the minute inner workings of the lining of the stomach, argue vociferously over something as simple as their belief that the Helicobacter pylori bacterium couldn’t be responsible for stomach ulcers.

Even when confronted with mountains of evidence that there was a relationship between a person’s genetic predisposition to infection by the bacteria and their incidence of stomach ulcers, these gentlemen refused to be convinced that their intimate understanding of acid production, gastric emptying, and other G.I. details wouldn’t provide the ultimate answer. Their inability to see the relationship between genetics, immune system, G.I. system, and other variables effectively blinded them to an obvious treatment for their patients’ ulcers — a simple antibiotic. Of course, this is common knowledge and treatment today. But 10 years ago it was something on the order of a holy war between believers and infidels, to be waged with new acid inhibitors, surgeries, and diets.

I think there are three fundamental issues that cause a great disconnect between geeks and doctors: culture, the nature of the systems they interact with, and “ease of use.”

Culturally, the medical profession and the information sciences seem superficially similar. Both require highly dedicated, technically astute individuals to spend years accumulating knowledge. The differences come from the way this knowledge is acquired and applied. Doctors draw on an enormous body of codified knowledge. Rules, relationships, anatomies, formulae, and procedures form a vast sea of information to be sorted, categorized, memorized, and retrieved on demand. It’s no wonder that some expert systems experience a higher diagnostic success rate than human physicians when confronted with complicated, obscure symptoms and pathologies. The amount of information to be managed and recalled in context is enormous.

I had the opportunity to watch an entering class of first year med students for their entire time at the University of Texas Medical School in Houston, through their third year and then their internships. It was an amazing transformation from wide-eyed idealist to detached, white-coated professional. I, personally, think the kids lost something important on the way to becoming doctors. The process involves replacing curiosity and the ability to ask questions with absolute certainty and the ability to make instantaneous decisions. After all, lives are at stake and hesitation or indecision can be fatal, or at least that’s what they’re taught. Add to that the fact that there are generally only one or two “accepted” ways to accomplish a particular medical outcome and it starts to become clear where the two cultures diverge.

On the other hand, geeks have relatively little codified knowledge to draw on early in the learning process. Culturally, geeks learn by doing. They take a few relatively simple tools and techniques that they learned in school and apply them, generally in an iterative process of learning, until the desired result is achieved. It would be impossible to try and memorize all the possible ways to implement a given information system. There are as many different techniques as there are individuals. This seems incredibly untidy to a scientist. There’s no “right” answer. And geeks learn by sharing, which is something that is culturally abhorrent in the world of academic research. In the highly competitive world of research grants, tenure, and “publish or perish,” the incentive to share has been all but removed. So it seems that these two worlds are culturally at odds from the start.

Couple that with the fact that the human body is an enormously complicated system. It’s clear that we don’t really understand it all that well. People still get sick and die. Surgeons still use knives to chop off parts of a sick body, internists still mix chemicals into humans to correct specific problems, and neurologists still treat the brain as a relatively fragile piece of meat to be poked, shocked, and chemically treated into proper function. It’s not that surgeons, internists, and neurologists aren’t extremely capable, well-educated, and talented individuals, because they are. It’s that they started with a system they knew absolutely nothing about and no tools beyond their five senses to uncover its secrets. It’s only recently that technology and science have advanced to the point that things are significantly different for doctors than they were 500 years ago.

Geeks started with nothing. No complex systems, no unknown processes, no mysteries of life. Information science sprang into being over a period of at most 75 years. Everything in a geek’s universe, including his tools, the systems he builds on, and the things he creates, is the product of other geeks. Geeks are the ultimate controllers of their universe because they created it, which is a luxury not yet afforded the medical profession. It means that geeks can have ultimate knowledge of any of their systems. One day, that will also be true for the medical profession. Once the human body is completely understood, it will cease to be the most complicated system we know of. Geeks will simulate it, and the simulation will by definition be a more complicated system. But it isn’t the case now, and that is where the biggest dichotomy between geeks and doctors comes from. Doctors deal with a mostly unknown system with emergent behaviors. There are virtually no information systems complicated enough to exhibit emergent behaviors yet. To a doctor, that means what geeks do is inherently less difficult.

My years in the TMC taught me that doctors tended to downplay the value created by geeks. What geeks did seemed too easily understood and too easy to master. Realities aside, piling up centuries of arcane knowledge seemed inherently more valuable to them than something that they considered mostly a trade. This is the crux of the third element, “ease of use.”

In the medical world, the user is the system (i.e., the patient). The doctor’s work product is hopefully an improved system. But the problem is the user. The user has little knowledge of the system, isn’t sure when or if it’s broken, and is actually fearful of the process of modifying or using the system. It’s not a user-friendly system or process, and the user’s life is literally at stake. All knowledge of the system and how it is supposed to work is held by the doctor, putting the user at a serious disadvantage and placing the medical professional in an unnaturally superior position of total control.

In the geek’s world, systems are specified by and used by the users. I’m talking about complex systems built by teams of developers, spanning large physical infrastructures and not relatively simple desktop applications created by one or two engineers. Military command and control systems, large telecommunications infrastructures, complex avionics systems, and large multi-user operating systems all fall into this category. While they may be too complicated for the user to create or maintain themselves, the process of specifying and using them isn’t inherently fear inducing. Geeks actually strive to serve the user and minimize the difficulty of using the system, hiding the complexities of the technology from the user. The intent is that the user is in control.

This desire to place the user in control results in an apparent oversimplification of geek-produced technologies and provides the users with an illusion of simplicity. The measure of a good information system is always its unobtrusiveness and ease of use. Nobody considers what transpires when they make a cell phone call, for example. But this is one of the most complex interactions with technology the average person performs on a daily basis. Understanding the intricacies of how the call is placed, and possessing the knowledge to actually engineer such a system, certainly exceeds what is necessary to extract an appendix.

So why do medical professionals place themselves in an intellectually superior position, and why do geeks adopt a diminished role? The answer seems obvious, that doctors deal with the lives of people and geeks just deal with bits of technology that make lives easier. That was certainly how several doctors and PhDs related it to me. And that probably would be the end of it if it were true.

Unfortunately, geeks have done too good a job of simplifying what they do for users, hiding both the complexities of the systems they build and their ultimate value to society. I think the reality is that doctors keep individuals’ physical infrastructure functioning and geeks keep society’s infrastructure functioning. How many people would lose their lives tomorrow if the telephone system stopped working? If the avionics in all 747s failed? If the GPS guidance system in a cruise missile failed? If the computer systems running the nation’s power grid failed? Don’t kid yourself. Geeks do a job that is likely far more important to the safety and welfare of a larger number of people than the medical profession does. They just do it in a quiet, unassuming way that doesn’t require elevating individual accomplishment. And people do take it for granted. That should be a tribute to the geeks for a job well done.

One day, the medical profession will have eliminated all of the mystery surrounding the human body. When that day comes, doctors as we know them will likely cease to exist. Medicine will just become one more system for the geeks to model and implement. Until then, people are going to continue to elevate those who keep them alive and continue to overlook the others who merely simplify their lives, even if they don’t know who’s who.

Posted by & filed under Tech.

This blog is weird. Sometimes it goes for months without stuff happening, then it gets a flurry of new content. We’re moving into the “flurry” stage again. I’ve spent much of the past year or so working with various mobile application technologies, settling on hybrid apps that include a substantial HTML5 component. This has led to some interesting business (and technology) opportunities and I’m going to try and document some of them here.

One of the biggest technical challenges has been getting content to render effectively first across phones and tablets, and then desktops. Using tools like PhoneGap and Sencha Touch has made that process fairly easy. Twitter’s Bootstrap framework takes the opposite approach, rendering nicely on desktops but then scaling down to mobile devices. It’s what’s driving the new theme on this blog. Great for taking a content authoring tool like WordPress and extending it out (painlessly!) to mobile devices.

It’d be nice to see some sort of convergence happen between Bootstrap and the purely mobile frameworks like Sencha’s and the JQuery-based JQTouch and JQuery Mobile. Stay tuned and let’s see what happens.

Posted by & filed under Opinions, Tech.

I’ve been lending an occasional hand to Dave Winer on portions of his EC2 for Poets project that is building on the OPML Editor and other tools to create a “coral reef” alternative to Twitter-like services. One of the more thorny issues is how to deal with identity for users and their content across a big federation of systems, operating systems, and platforms.

The short answer is to use one of the ubiquitous “platform” services everyone pretty much has to use on the Internet. Things like email addresses, URLs, domain names, etc. all fall into this category. Email addresses are easy, but disposable, can be easily manipulated, and aren’t so hot from a Twitter-like syntax perspective when prefixed with “@”.

URLs are clunky and hard to remember, often break, get shortened and obfuscated, and require a server at the end of each URL.

Simple domain names, like chuck.shotton.com, would be an awesome alternative to service-specific usernames (e.g. @cshotton on Twitter) because they could be tied to a persistent portion of the Internet infrastructure. The problem is one of ease of use. DNS ju-ju is inscrutable to most folks. Anything beyond setting up a home page on GoDaddy, like managing DNS records, running a DNS server, and keeping one up to date, falls into the realm of grey-bearded Internet science projects.

Enter the idea of a REST interface to DNS. Something that makes it drop dead simple to add, change, remove, and otherwise operate the DNS system as an easy-to-use mapping between a simple identity and the location of some other resource. That’s exactly what DNS was designed to do. Unfortunately most people only know it as a way to map friendly domain names into gnarly TCP/IP addresses.

But suppose it mapped friendly domain names into some branch of your online identity? Instead of mapping to an IP address, it could map to a URL to your RSS feed. Instead of using some random, often-changing email address, you could map a fixed name that you kept forever into specific resources on the Internet that you were related to — web sites, Twitter users, RSS or OPML feeds, etc. Much better for the long term, since people can always find you, and you simply need to point your identity at a new resource. TXT records in the DNS system are perfect for this. But the semantics of the TXT strings need to be defined for this to work.
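
On the reading side, the lookup is already trivial with tools that exist today. Here is a rough sketch in TypeScript using Node’s built-in resolver; the key=value convention inside the TXT strings (feed=..., twitter=...) is just a straw man of mine, exactly the kind of semantics that still needs to be pinned down.

    // Sketch: pull identity pointers out of a name's TXT records.
    // The "key=value" convention inside the TXT strings is an assumption,
    // not an existing standard.
    import { resolveTxt } from "node:dns/promises";

    async function lookupIdentity(name: string): Promise<Record<string, string>> {
      const records = await resolveTxt(name); // each record arrives as an array of string chunks
      const identity: Record<string, string> = {};
      for (const chunks of records) {
        const txt = chunks.join("");
        const eq = txt.indexOf("=");
        if (eq > 0) identity[txt.slice(0, eq).trim()] = txt.slice(eq + 1).trim();
      }
      return identity;
    }

    // e.g. { feed: "http://shotton.com/feeds.opml", twitter: "cshotton" }
    lookupIdentity("chuck.shotton.com").then(console.log).catch(console.error);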

In all fairness, the market has already produced a service that pretty much allows all this to happen as part of the coral reef. DNSimple.com gives a great set of APIs for managing domains and DNS records through simple Web interfaces that anyone able to tweak a little Javascript can call remotely. Whether it’s the right answer, and whether there’s an open source alternative, are a couple of the questions still to resolve.
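
For flavor, here is what the publishing side might look like against a generic provider’s REST API. I have not nailed down DNSimple’s actual endpoints here, so the host, paths, and payload below are hypothetical placeholders, not their documented interface.

    // Straw man for publishing an identity by writing a TXT record through
    // a DNS provider's REST API. Host, paths, and payload are hypothetical,
    // NOT DNSimple's documented endpoints.
    const DNS_API = "https://dns.example.com/v1";

    async function publishIdentity(
      apiToken: string,
      zone: string,   // e.g. "shotton.com"
      label: string,  // e.g. "chuck"
      value: string,  // e.g. "feed=http://shotton.com/feeds.opml"
    ): Promise<void> {
      const res = await fetch(`${DNS_API}/zones/${zone}/records`, {
        method: "POST",
        headers: { Authorization: `Bearer ${apiToken}`, "Content-Type": "application/json" },
        body: JSON.stringify({ type: "TXT", name: label, content: value, ttl: 3600 }),
      });
      if (!res.ok) throw new Error(`DNS update failed: ${res.status}`);
    }

If a write really is just one authenticated POST like this, then “point your identity at a new resource” becomes something any small tool or web form could do for a non-technical user.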

Here are the pending questions that I think need answers before this solution takes off:

  • REST APIs for DNS – what is the best solution for widespread developer access and support?
  • DNS TXT records – what standards, if any, already exist for using these DNS fields to map to URLs or other identity methods?
  • If nothing exists, what should go in TXT records? (e.g. some XML fragment like <url=”http://shotton.com/feeds.opml” /> ?)

I’m interested in trying to leverage DNSimple or a comparable service to start prototyping this identity-mapping-through-DNS service. Anyone else want to play along?

Posted by & filed under Tech.

In case people show up here looking for an old post on rssNimbus, the source code and links to documentation can be found here.

Posted by & filed under Opinions, Tech.

The following tweet yesterday switched on a lightbulb of sorts:

suelyn @doctorow Will digital publishing be the end of bricks & mortar libraries.

Let me put that in context. My buddy, Dave Winer, has been on a long-time quest to find, foster, create, or instigate some way for there to be a long-lasting, secure, independent archive of our lives’ digital works. Dave has a huge body of mostly-digital creations, but he’s also interested in providing a repository for his dad’s blog, photos, his uncle’s writings, family memorabilia, etc. It’s something everyone needs, from the simplest of bloggers to the most well-read A-List journalist — some place where your stuff can be read 100, 300, or a thousand years from now, without fear of being censored by a government, neglected by a corporation, or abused by cyber-vigilantes.

Lots of the focus has centered on long-lived institutions like the Library of Congress or colleges and universities. But the tweet made me think of an equally long-lived set of institutions that are fundamentally aligned with the interests of a free and open society — local public libraries.

The local library exists because it is important to its community. Libraries serve as focal points for community activities, with events and programs for toddlers, kids, and adults alike. Meeting rooms, multimedia, and computer labs all make up a part of the 21st century public library. Unfortunately, the writing is on the wall regarding their most fundamental service, lending physical books to their patrons.

In the absence of a compelling follow-on mission, some could be prompted to make a tweet such as the one above. Once we all have Kindles, who needs the library? Here’s the lightbulb.

There are literally thousands of public libraries in the US today (122,101 according to the American Library Association), and most are well-connected to the Internet and have modern computing facilities for use by staff and patrons. And there are easily that many again scattered around the rest of the planet. A quarter of a million sites for information storage is a pretty compelling bit of redundancy! So, why not leverage a few simple technologies and create the long-lived, public digital archive that piggy-backs on the public library mission?

Imagine a large, distributed, redundant file system (there are lots to choose from) with a well-connected peer to peer application suite that archives, shares, replicates, and serves digital content across the planet. A botnet of sorts for our digital masterpieces. Each library only has to commit a relatively small amount of CPU, network, and data storage (probably just one dedicated PC with a few terabytes of storage) and we suddenly have petabytes of redundant digital archive that could last as long as our respective societies place a value on free access to information.
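
To make the redundancy piece a little less hand-wavy, here is one straw man for a single detail: how every library’s node could agree, with no central coordinator to fail or be pressured, on which k sites hold a given document. The sketch uses rendezvous (highest-random-weight) hashing over the document ID and the list of participating libraries; the node names and replication factor are assumptions for illustration, not a design.

    // Sketch: deterministically pick which k library nodes replicate a
    // given document using rendezvous hashing. Node IDs and k are
    // illustrative assumptions.
    import { createHash } from "node:crypto";

    // Weight of a (library, document) pair: first 64 bits of a SHA-256 digest.
    function weight(libraryId: string, docId: string): bigint {
      const digest = createHash("sha256").update(`${libraryId}:${docId}`).digest("hex");
      return BigInt("0x" + digest.slice(0, 16));
    }

    // Every peer runs the same calculation and arrives at the same k holders.
    function replicaSites(docId: string, libraries: string[], k = 10): string[] {
      return [...libraries]
        .sort((a, b) => (weight(b, docId) > weight(a, docId) ? 1 : -1))
        .slice(0, k);
    }

    // Hypothetical node names, for illustration only.
    const libraries = ["lib-houston-tx", "lib-portland-or", "lib-oslo-no", "lib-melbourne-au"];
    console.log(replicaSites("sha256-of-some-manuscript", libraries, 2));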

The implementation is left for another post, but the idea is certainly compelling. Librarians have shown themselves to be paragons when it comes to their mission of adopting and adapting information technology in a manner that serves the public. Let’s figure out how to give them their next mission and build the tools for them to do it!

Posted by & filed under Opinions.

This WikiLeaks crap bothers me. This morning, PayPal has shut them off, making the judgement that their activity is somehow illegal without any hint of due process. There is a story not getting told, which is all about who is pressuring Amazon, PayPal, DNS providers, and ISPs to turn off WikiLeaks.

Kiddie porn sites, illegal gambling sites, etc. all get a blind eye. Now that someone is pissed about WikiLeaks, these companies all suddenly take the moral high ground. Why isn’t anyone asking THEM why? Why now? Really asking them and not just taking their bogus PR as an answer. (It sure as hell isn’t that tool Joe Lieberman.)

Here’s the last best shot old school news organs have at leveraging their news desks and they are cowering. *I* want to know the answer, and short of a blogged leak from inside PayPal, we’ll never find out. Investigative journalism seems to be over.

Posted by & filed under Opinions, Tech.

“Do I really look like a guy with a plan? You know what I am? I’m a dog chasing cars. I wouldn’t know what to do with one if I caught it. You know, I just do things. The mob has plans, the cops have plans, Gordon’s got plans. You know, they’re schemers. Schemers trying to control their little worlds. I’m not a schemer. I try to show the schemers how pathetic their attempts to control things really are.”

The Joker, The Dark Knight