Thursday, September 30, 2004

Via Who Throws a Shoe, I learn that Mel Brooks has called Toronto's bagels "mushy". I can comment on this vital issue. Although I've never lived quite in Toronto, I've lived nearby and visit frequently. Also, I've sampled Montreal bagels, New York bagels, and many instances of second-class bagels from around the continent.

You see, only Montreal and New York bagels are really the original ethnic bagel, made by Jewish people for Jewish communities with the same unadulterated ingredients (slightly different for these two cities, I believe) and quirky bagel-making techniques (including boiling the rings before baking in hot ovens, and who knows what other black magic). That may be because Montreal and New York have long had large communities dominated by Jewish culture, and the ability to affect bagel tastes throughout the rest of the city.

In the rest of Canada/US, bagels are poor imitations made with standard baking equipment and therefore with modified recipes. Donut stores that decide to add bagels to their menu often create pseudo-bagels that can be made with their existing equipment. The result is topologically the same as a bagel but otherwise sub-par -- a mainstream "bagel" is merely a bun with a hole. Even steaming equipment produces a milquetoast substitute. It's just not the same, but most North Americans don't know the difference or don't care. Perhaps some soft-gummed softheads even prefer the less chewy mainstream bagel.

So what's Toronto's story? It's mixed. Some parts of Toronto are like the rest of the wasteland outside Montreal and New York, as epitomized by the ever-present donut stores which make such great Canadian donuts (more on this another time) but such humdrum purported bagels. However, there is a strong Jewish community in some parts of Toronto (such as North York). So in these communities, if you know where to shop, you can find real boiled-and-baked bagels, real chewy and dense and shiny on the outside.

Mel, I'm sympathetic, but you should know better than to buy a bagel just anywhere.

Thursday, September 23, 2004

Another stab (like mine last spring) at why there seems to be a disproportionate fear of economics and economists: Arnold Kling puts forth a few possible reasons in his TCS article. He mentions the math barrier, but I still think that disliking the conclusions of economic thinking is one of the biggest reasons -- and that it can't be dealt with simply by teaching economics without math.

Monday, September 20, 2004

Another knitting project completed: a fuzzy baby cardigan.



Also see Ekr's post describing our weekend camping trip. It was our toughest trip yet, but I feel surprisingly good only a day later.

Tuesday, September 14, 2004

I can't resist exposing unintended consequences of well-meaning social programs. Here's a new one (to me) related to the minimum wage: by having a minimum wage, a state blocks entry to the workforce for certain very low-wage and entry-level workers, and this disrupts the stepping-stone process by which low-wage workers move up the chain to earn higher wages. In other words, a minimum wage hurts a state's rate of moderate-wage employment later on. There's a great explanation at Marginal Revolution.

Friday, September 10, 2004

Open source and open standards

This is part two of a rumination on the trendy adjective "open", exploring why open source and open standards don't go together as well as one might think.

The phrase "open standard" is more loosely defined than "open source", although Bruce Perens is attempting another definition. Microsoft calls a protocol an open standard if the specification is publicly readable or if Microsoft allows or licenses others to legally implement it. However, Microsoft typically retains change control over its so-called open standards. Sun also maintains change control over its major standards and may not even offer non-discriminatory licensing for implementations, but claims its standards are more open than Microsoft's because you can "check out" of the standard (ref). The war between Microsoft and Sun over who has more open standards is laughable.

I think these are the main criteria to judge how open a standard is:
  • Is published and available at low or zero cost
  • Can be implemented by anybody freely or with non-discriminatory licensing
  • Can be modified by a group with either completely open membership (IETF) or one that does not discriminate in membership applications (W3C).
  • Can be implemented interoperably on a wide variety of platforms and with a wide variety of technologies. (E.g. HTTP can be implemented in C, Java or most other languages, on Windows, Unix or just about any modern operating system.)
Anyway, for now let's call standards whose development is open to public contributions "public standards".
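The last criterion, platform and language independence, is easy to see concretely with HTTP: the protocol is just structured text over a socket, so any language that can read and write bytes can implement it. Here's a minimal sketch in Python (the host and path are made-up placeholders, and real implementations handle far more of the spec):

```python
# HTTP/1.1 messages are plain CRLF-delimited text, which is why the
# protocol can be implemented in C, Java, Python, or nearly anything
# else. This sketch builds a request and parses a canned response
# without using any HTTP library.

def build_request(host: str, path: str) -> bytes:
    # A GET request is a request line, header lines, and a blank line.
    lines = [
        f"GET {path} HTTP/1.1",
        f"Host: {host}",
        "Connection: close",
        "",
        "",
    ]
    return "\r\n".join(lines).encode("ascii")

def parse_status(response: bytes) -> tuple[int, str]:
    # The status line is the first CRLF-terminated line,
    # e.g. "HTTP/1.1 200 OK".
    status_line = response.split(b"\r\n", 1)[0].decode("ascii")
    _version, code, reason = status_line.split(" ", 2)
    return int(code), reason

req = build_request("example.com", "/index.html")
assert req.startswith(b"GET /index.html HTTP/1.1\r\n")

canned = b"HTTP/1.1 200 OK\r\nContent-Length: 0\r\n\r\n"
print(parse_status(canned))  # (200, 'OK')
```

Nothing here depends on the operating system or language runtime, which is exactly what makes a standard like this broadly implementable.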

Why would an open source effort fail to support public standards, or fail to be interoperable? There are many possible reasons, most of which amount to either a high cost that open source developers can find difficult to bear, or a lower benefit than a commercial product would gain from doing standards work.
  • It can be expensive to learn how to implement the standard. Books, workshops, and seminars can help a developer learn the standard, as can participation in a standards group, which may require airfare or meeting fees to do effectively. Writing a from-scratch implementation of a public standard is not usually easy.

  • Developers may take shortcuts in reading, understanding, and implementing standards, trying to get something useful working fast rather than make the full investment. There may be features that are required by the standard but aren't necessary for the product to work in the developer's main use cases. These shortcuts lead to less-interoperable implementations. Closed-source projects may take shortcuts too, but they face pressure to fix the holes quickly in order to claim compatibility, prove interoperability, and sell to customers.

  • Vanity, confidence, or ego: it's fun and impressive to design a protocol from scratch. An open source developer who is doing the whole project for fun may weigh that entertainment value more heavily than a paid developer would.

  • Commercial products must have the credibility of being supported for the long run, and in planning for the long run, developers have other tradeoffs to make. For example, a protocol designed in-house for a specific purpose tends to have design errors which make it harder to maintain in the long run. There are a lot of subtle design points and extra effort involved in making a protocol extensible and flexible. Later on, continuing to support the protocol with newer versions of the software may come to be quite a pain. If they're wise and can afford to do so, developers of long-lived software make the decision to invest early in supporting a standard protocol rather than invent their own with unknown problems.

  • What if the developer can put in the effort to implement the public standard, but it's not quite ideal for the developer's purposes? It's possible for the developer to influence the public standard, but contributing to standards takes a long time and a lot of effort. If the existing standards don't quite match up to the requirements, an open source developer may not have the resources or time to help it change. Thus, the developer may choose to only partly implement the standard, or implement undocumented extensions, at a detriment to interoperability.

  • There's an assumption in open source that because you can read the source, the source is the specification. Why should an open source developer take the time to publish a specification for a protocol when you can just read the source to see how it works? Therefore, when open source developers do extend or invent a protocol, the resulting protocol often remains undocumented, which isn't conducive to interoperability.

  • Interoperability testing can be expensive -- it might take revenues of some kind to afford it. And if you can't afford interoperability testing, it's harder to justify implementing the interoperable standard in the first place.
Although it seems a little rude because Subethaedit is a wonderful tool, I'll pick on it for a minute. It's free, it's open source, it has a relatively open process, and it even supports some public standards. It uses Rendezvous (an Apple preview of Zeroconf, which is a public standard in development) and BEEP (an IETF public standard). However, there is also a non-public protocol used over BEEP to communicate the actual changes to the communally edited document. Thus, it would be a challenge for somebody to write another client that interoperated with Subethaedit.

Sadly, many open source projects (as well as many closed source) use the phrase "supports open standards" or "based on open standards" as if it were pixie dust, automatically conferring open interoperability. That's not the case. An open source project can easily fail to be interoperable with other projects, just by innovating new features without rigorously documenting and designing protocols and protocol extensions.

Some open source projects overcome all these hurdles and support public standards, which I, personally, am very grateful for. Once in a while open source developers actually design a new protocol which is interoperable, flexible and unique enough to become a widely supported public standard, and that too deserves kudos.

Tuesday, September 07, 2004

Open as in source, or Open as in process?

A long-standing terminology issue in free software is whether it's "free as in speech, or free as in beer" (ref). A related confusion arises with the adjective "open". The term "Open Source" is most authoritatively defined by the Open Source Initiative. Other phrases like Open Standard and openness (as a way of operating an organization) are more loosely used. With efforts like Open Content and Open Law (not to mention OpenCola), openness is clearly in vogue now.

These "opens" don't always go together, despite common assumptions. There's an automatic assumption that they are all meritorious, and that anybody with the merit of doing open source software also has the merits of following open standards and an open process.

Open source and open process

Since "openness" is too vague, I'll take a stab instead at considering "open process". The phrase is relative: a more open process is one which is more transparent, less restricted in participation, and less obscure. For example anybody may search the Mozilla bug database and see bug status (transparent) and anybody may submit a bug (unrestricted participation). In addition the bug database is easy to find and there are instructions for use (not obscure).

Why would an open source effort fail to have an open process? Simple -- a process has a cost that must be offset by its benefits, and an open process can have a higher cost than a closed one. A small group of programmers (one to three, maybe four) doesn't get much benefit from formal processes, let alone formally open processes. How to contribute to the project, how submissions get accepted, or what features get added may be decided in a very opaque way; decisions may be made by one person or in private conversations among a small group.

Contrast that to a company where there are very likely to be customer requests and some standard process to handle them. The process may very likely be closed and private, but it's also quite possible for a company to have an open process to discuss feature inclusion. For example, at Xythos, we started a customer forum where customers could get together and discuss the merits of various feature requests. The final decision was up to us but the process was not entirely closed. Some open source projects don't even have a forum for discussing feature requests, or a public list of feature requests.

Of course, open source efforts are probably more likely overall to have open processes. Sites like Sourceforge contribute to this synergy by opening up bug tracking and discussion forums in a cheap way -- open source contributors don't have to make a huge investment to make the development process more open as well.

At OSAF we put a lot of effort into ensuring that our processes are transparent, participatory, and not obscure. It's been estimated that one part of that alone, the weekly IRC chats, consumes nearly one morning per developer per week -- that's enormous. Documenting everything we say in in-person meetings and keeping public Wiki pages up to date are other major costs. We hope that doing the "openness" thing confers benefits on the larger community in the long run, but at such a high cost, it's obviously not something every open source group can bear.

Part II of this post, discussing the weaker synergy between open source and open standards, will come this weekend I hope.

The CalSch Working Group is being closed. I believe this is a good thing -- IETF working groups which stick around for years and years tend to shrink and get dysfunctional. Closing the group clears the slate.

This doesn't mean that no standards work is going to get done in calendaring. In fact, it seems quite the opposite: in the past six months there's been quite a surge of interest in calendaring features and calendar interoperability. The new lists I mentioned in a previous post have seen a fair bit of traffic. The access protocol discussion list (ietf-caldav@osafoundation.org) has 96 members and 72 posts in under a month. The other list, discussing bringing the calendar format standards to a new level of RFC (ietf-calsify@osafoundation.org), has 85 members and 154 posts in the same time period. I've talked to a number of different companies, organizations, and even venture capitalists about calendaring standards action, and there's an interesting roundtable coming up in Montreal.

Since I have been dragging people into discussing calendaring standards for a year now, and published a calendaring standard proposal eight months ago, I feel like my timing, for once, is right. Maybe I'm one of the butterflies furiously flapping my wings to create a bit of wind-speed out there today.


Creative Commons License
This work is licensed under a Creative Commons Attribution 3.0 Unported License.