Friday, October 15, 2004
Tuesday, October 05, 2004
Ingredients:
- 1 cup half and half cream
- 1/4 cup grated fresh ginger root
- 1 cup heavy whipping cream, whipped to soft peaks
- 1 cup ginger ale or diet ginger ale
- 1/2 to 3/4 cup sugar or Splenda
- 1 Tbsp. lemon juice
Soak the grated ginger root in the half and half. Put this, the ginger ale, and the whipping cream in the fridge to chill for an hour or so. Use a fine sieve to strain the fresh ginger out of the cream. Stir the ginger-flavored cream together with the whipped cream, ginger ale, sugar/Splenda and lemon juice, and immediately freeze in your ice cream maker.
Mmm! I imagine for those who love ginger and can also eat sugar, candied ginger would be a nice addition to the recipe.
Monday, October 04, 2004
- RFC3920, Core protocol (framework for applications like notifications, as well as instant messaging and presence)
- RFC3921, Instant messaging and presence over the core protocol (a sketch of a message stanza appears after this list)
- RFC3922, how to map XMPP presence information and instant messaging to the Common Presence and Instant Messaging (CPIM) interchange format, the same format SIMPLE instant messaging can map to
- RFC3923, requirements for end-to-end security of instant messages even when routed over more than one protocol
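To give a flavor of what RFC3921 standardizes, here's a minimal sketch of a message stanza, built with Python's standard library; the addresses are invented, and a real stanza travels inside a namespaced XML stream:

```python
import xml.etree.ElementTree as ET

# Sketch of an XMPP message stanza (RFC3921). The JIDs are made-up
# examples; on the wire this would sit inside the stream's XML namespace.
msg = ET.Element("message", {
    "to": "juliet@example.com",
    "from": "romeo@example.net",
    "type": "chat",
})
ET.SubElement(msg, "body").text = "Wherefore art thou?"
print(ET.tostring(msg, encoding="unicode"))
# -> <message to="juliet@example.com" from="romeo@example.net"
#    type="chat"><body>Wherefore art thou?</body></message>
```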
The work began earlier than two years ago, of course. Jabber was designed while Jeremie Miller was waiting for the IETF to come up with a standard IM protocol (and taking too long). When the IETF process still appeared stalled, Jeremie and others proposed that the IETF could standardize Jabber, by creating a working group to design a revision of Jabber that met the IETF standards for internationalization, security, XML namespace usage and a few other things. A BOF meeting was held in July 2002; it was eventful and entertaining, as IETF meetings go. The WG was formed, and on October 31, 2002, I was added as co-chair to help Pete Resnick. Peter Saint-Andre authored *all* of our internet drafts, doing a huge amount of difficult work in an extremely timely, professional and skilled manner. Whew! It's been fun!
Thursday, September 30, 2004
You see, only Montreal and New York bagels are really the original ethnic bagel, made by Jewish people for Jewish communities with the same unadulterated ingredients (slightly different for these two cities, I believe) and quirky bagel-making techniques (including boiling the rings before baking in hot ovens, and who knows what other black magic). That may be because Montreal and New York have long had large communities dominated by Jewish culture, and the ability to affect bagel tastes throughout the rest of the city.
In the rest of Canada/US, bagels are poor imitations made with standard baking equipment and therefore with modified recipes. Donut stores that decide to add bagels to their menu often create pseudo-bagels that can be made with their existing equipment. The result is topologically the same as a bagel but otherwise sub-par -- a mainstream "bagel" is merely a bun with a hole. Even steaming equipment produces a milquetoast substitute. It's just not the same, but most North Americans don't know the difference or don't care. Perhaps some soft-gummed softheads even prefer the less chewy mainstream bagel.
So what's Toronto's story? It's mixed. Some parts of Toronto are like the rest of the wasteland outside Montreal and New York, as epitomized by the ever-present donut stores which make such great Canadian donuts (more on this another time) but such humdrum purported bagels. However, there is a strong Jewish community in some parts of Toronto (such as North York). So in these communities, if you know where to shop, you can find real boiled-and-baked bagels, real chewy and dense and shiny on the outside.
Mel, I'm sympathetic, but you should know better than to buy a bagel just anywhere.
Thursday, September 23, 2004
Monday, September 20, 2004
Also see Ekr's post describing our weekend camping trip. It was our toughest trip yet, but I feel surprisingly good only a day later.
Tuesday, September 14, 2004
Friday, September 10, 2004
This is part two of a rumination on the trendy adjective "open", exploring why open source and open standards don't go together as well as one might think.
The phrase "open standard" is more loosely defined than "open source", although Bruce Perens is attempting another definition. Microsoft calls a protocol an open standard if the specification is publicly readable or if Microsoft allows or licenses others to legally implement it. However, Microsoft typically retains change control to its so-called open standards. Sun also maintains change control over its major standards and may not even have non-discriminatory licensing for implementations, but claims their standards are more open than Microsoft's because you can "check out" of the standard (ref). The war between Microsoft and Sun over who has more open standards is laughable.
I think these are the main criteria to judge how open a standard is:
- Is published and available at low or zero cost
- Can be implemented by anybody freely or with non-discriminatory licensing
- Can be modified by a group with either completely open membership (IETF) or one that does not discriminate in membership applications (W3C).
- Can be implemented interoperably on a wide variety of platforms and with a wide variety of technologies. (E.g. HTTP can be implemented in C, Java or most other languages, on Windows, Unix or just about any modern operating system.)
Why would an open source effort fail to support public standards, or fail to be interoperable? A lot of possible reasons here, many of which add up to either a high cost which open source developers can find difficult to bear, or lower benefit than a commercial product would gain from doing standards work.
- It can be expensive to learn how to implement the standard. Books, workshops and seminars can help the developer learn the standard, as can participation in a standards group, which may require airfare or meeting fees to do effectively. Writing a from-scratch implementation of a public standard is not usually easy.
- Developers may take shortcuts in reading, understanding and implementing standards, trying to get something useful working fast rather than make the full investment. The standard may require certain features that aren't necessary for the product to work in the developer's main use cases. The shortcuts lead to less-interoperable implementations. Closed-source projects may take shortcuts too, but there are pressures to fix the holes quickly in order to claim compatibility, prove interoperability, and sell to customers.
- Vanity, confidence, or ego: it's fun and impressive to design a protocol from scratch. An open source developer who is doing the whole project for fun may find the entertainment value more important than a paid developer.
- Commercial products must have the credibility of being supported for the long run, and in planning for the long run, developers have other tradeoffs to make. For example, a protocol designed in-house for a specific purpose tends to have design errors which make it harder to maintain in the long run. There are a lot of subtle design points and extra effort involved in making a protocol extensible and flexible. Later on, continuing to support the protocol with newer versions of the software may come to be quite a pain. If they're wise and can afford to do so, developers of long-lived software make the decision to invest early in supporting a standard protocol rather than invent their own with unknown problems.
- What if the developer can put in the effort to implement the public standard, but it's not quite ideal for the developer's purposes? It's possible for the developer to influence the public standard, but contributing to standards takes a long time and a lot of effort. If the existing standards don't quite match up to the requirements, an open source developer may not have the resources or time to help it change. Thus, the developer may choose to only partly implement the standard, or implement undocumented extensions, at a detriment to interoperability.
- There's an assumption in open source that because you can read the source, the source is the specification. Why should an open source developer take the time to publish a specification for the protocol when you can just read the source to see how it works? So when open source developers do extend or invent a protocol, the resulting protocol often remains undocumented, which isn't conducive to interoperability.
- Interoperability testing can be expensive. It might take revenues of some kind to afford it, and if you can't afford interoperability testing, it's harder to justify implementing the interoperable standard at all.
Sadly, many open source projects (as well as many closed source) use the phrase "supports open standards" or "based on open standards" as if it were pixie dust, automatically conferring open interoperability. That's not the case. An open source project can easily fail to be interoperable with other projects, just by innovating new features without rigorously documenting and designing protocols and protocol extensions.
Some open source projects overcome all these hurdles and support public standards, which I, personally, am very grateful for. Once in a while open source developers actually design a new protocol which is interoperable, flexible and unique enough to become a widely supported public standard, and that too deserves kudos.
Tuesday, September 07, 2004
A long-standing terminology issue in free software is whether it's "free as in speech, or free as in beer" (ref). A related confusion arises with the adjective "open". The term "Open Source" is most authoritatively defined by the Open Source Initiative. Other phrases like Open Standard and openness (as a way of operating an organization) are more loosely used. With efforts like Open Content and Open Law (not to mention OpenCola), openness is clearly in vogue now.
These "opens" don't always go together, despite common assumptions. There's an automatic assumption that these are all meritorious and that anybody with the merit of doing open source software also has the merits of following open standards and an open process.
Open source and open process
Since "openness" is too vague, I'll take a stab instead at considering "open process". The phrase is relative: a more open process is one which is more transparent, less restricted in participation, and less obscure. For example anybody may search the Mozilla bug database and see bug status (transparent) and anybody may submit a bug (unrestricted participation). In addition the bug database is easy to find and there are instructions for use (not obscure).
Why would an open source effort fail to have an open process? Simple -- a process has a cost that must be offset by its benefits, and an open process can have a higher cost than a closed process. A small group of programmers (one to three, maybe four) doesn't get much benefit from formal processes, let alone formally open processes. The way to contribute to an open source project, or how submissions get accepted, or what features get added, may happen in a very opaque way; decisions may be made by one person or made in private conversations among a small group.
Contrast that to a company where there are very likely to be customer requests and some standard process to handle them. The process may very likely be closed and private, but it's also quite possible for a company to have an open process to discuss feature inclusion. For example, at Xythos, we started a customer forum where customers could get together and discuss the merits of various feature requests. The final decision was up to us but the process was not entirely closed. Some open source projects don't even have a forum for discussing feature requests, or a public list of feature requests.
Of course, open source efforts are probably more likely overall to have open processes. Sites like Sourceforge contribute to this synergy by opening up bug tracking and discussion forums in a cheap way -- open source contributors don't have to make a huge investment to make the development process more open as well.
At OSAF we put a lot of effort into ensuring that our processes are transparent, participatory, and not obscure. It's been estimated that one part of that alone, the weekly IRC chats, consumes nearly one morning per developer per week -- that's enormous. Documenting everything we say in in-person meetings and keeping public Wiki pages up to date are other major costs. Obviously we hope that doing the "openness" thing confers benefits in the long run to the larger community, but at such a high cost, it's obviously not a cost every open source group can bear.
Part II of this post, discussing the weaker synergy between open source and open standards, will come this weekend I hope.
This doesn't mean that no standards work is going to get done in calendaring. In fact, it seems quite the opposite, since in the past six months there's been quite a surge of interest in calendaring features and calendar interoperability. The new lists I mentioned in a previous post have seen a fair bit of traffic. The access protocol discussion list (ietf-caldav@osafoundation.org) has 96 members and 72 posts in under a month. The other list, discussing bringing the calendar format standards to a new level of RFC (ietf-calsify@osafoundation.org), has 85 members and 154 posts in the same time period. I've talked to a number of different companies, organizations and even venture capitalists about calendaring standards action, and there's an interesting roundtable coming up in Montreal.
Since I have been dragging people into discussing calendaring standards for a year now, and published a calendaring standard proposal eight months ago, I feel like my timing, for once, is right. Maybe I'm one of the butterflies furiously flapping my wings to create a bit of wind-speed out there today.
Tuesday, August 31, 2004
One theory for the extra attempts to appear neutral: since there are only two real US parties, any admitted bias is tantamount to admitting that the paper favours one of them. In Canada politics may not be quite so partisan or polarized, or at least hasn't been for so long. A Canadian paper with a slight conservative bent could be a PC mouthpiece, or a Reform mouthpiece, or neither, so at least there's confusion about which party it's supposed to be the mouthpiece for. But I'm not sure this theory holds up, because Canadian politics can certainly be partisan (and nasty), and UK politics may be polarized much like American.
Another theory is that since Americans are generally so quick to come up with conspiracy theories, Americans are therefore quick to assume that politicians are somehow controlling journalistic output. Therefore a paper must appear especially untainted to avoid being written off as government-controlled propaganda.
I'm not terribly happy with either of these theories, and maybe I'm wrong that this is uniquely American, so feel free to chime in.
Monday, August 23, 2004
At OSAF, unfortunately, we don't have very good specs. We have great designs, but it's really hard to go from the designs to know what exactly works and doesn't work, what's implemented and what isn't, and how the non-GUI choices were made, at any given release. It's hard for a developer to pick a feature and see how to implement it, because the developer could either see too much (the wiki pages or fragments that show the bigger context for the design, the aspects of the feature that we can't do today) or too little (the wiki pages or fragments that seemed irrelevant but were vital). Part of the problem may be that we didn't have anybody on staff whose job it was to write specifications, like program managers at Microsoft do. But part of the problem is certainly tools.
OSAF uses a Wiki for most of our current documentation, from proposals to user documentation to meeting notes to personal jottings. The wiki is very fluid and chaotic and not very authoritative. It seems a wiki can be made to be more authoritative -- the wikipedia does this by formalizing around a single kind of content -- but we can't easily do that because our wiki already contains such divergent content. Besides, a wiki isn't an ideal environment for long documents, for printing, for version control, or for template usage.
Length: Specs are ideally longer than the average wiki page. To fully describe a feature so that the different contributors actually agree (rather than just appear to agree), to document the decisions made, and to make the feature testable and documentable, might require a 30 page specification for a 1-dev-month feature. Note that for a long document readers require a table of contents -- ideally clickable if you're reading it on a computer, with page numbering if it's printed.
Printing: You really want to be able to print specs, even today. I'm tempted to bring printed specs to a meeting and ban laptops to get them reviewed properly. Reviewers can mark up paper. Implementors can glance between paper and screen (and write on the paper) as they work.
Version control: Specs should be versioned and checked in and out and maintained in a versioned repository, just like code. Some wiki software does this but it's not very good yet -- I couldn't view a version of this page from before the picture was added. HTML doesn't version very well in general partly because images are external. And if you break wiki pages into smaller-than-30-page chunks for readability, then the versioning gets worse.
Templates: For template support, I miss Word. When I worked at Microsoft, there were Word wizards who made beautiful, useful templates for many purposes, from expense reports to specifications. As a program manager, I wrote many specifications, and I found that a template somewhat like this one kick-started the process. Word also allows templates to contain "click to fill in" fields (like author, product, release, team information; spec status, implementation status) including support for drop-down boxes to limit selection to a few appropriate choices. My favourite spec template had sections for overview, administrative UI, setup concerns, test considerations, rough documentation plan, and internationalization issues. Each of these sections was a reminder that the feature being specified might have impact on other groups. When I joined a group that didn't have a template I made or modified one to start with because I found them so useful. The IETF has specification templates and they make for some consistency between vastly different protocols and authors. A familiar template vastly improves readability.
Am I missing something that would solve this problem in HTML or in another open format? Is there some easy-to-use tool or set of tools to solve this problem? I know I could write my own software to do this, and it wouldn't even have to be terribly complex -- no worse than the XML2RFC tool which produces IETF specifications matching the template. But that's harder both for whoever writes the template (and the software to generate the document) and for the author, who has to get the angle brackets right.
[Tangentially related: According to Mark's amusing post, as a spec reader, I'm a moron. I don't understand a spec merely by reading it through. After I've implemented a spec or tried to explain it to somebody else, or extend it or edit it, I begin to understand it.]
[UPDATE: Ekr and Sheila point out that a Wiki also sucks for entering text. It's not WYSIWYG, so the process of updating and viewing and correcting pages can be slow. Also you've got to be online to change a page. There's no way to synchronize parts of the Wiki for offline changes other than manual painful silliness. Ekr objected to the whole comparison in the first place. He said that evaluating a Wiki for writing specs was like evaluating a bunsen burner for baking bread.]
Friday, August 20, 2004
The obvious approach to teaching girls what it might be like to be a developer is to put them at a computer and show them code. Unfortunately that requires having access to a computer lab at the conference location and setting up all those computers with the same tools. I only saw this done once -- Shaula Massena ran a brilliant workshop where girls used Visual Basic to construct a little prank program they could install on Dad or Mom's PC at home. The program simply popped up a dialog saying "click here to dismiss," only when you clicked "here" the dialog jumped somewhere else on the screen :) Shaula sent the girls home with floppies containing their program executable, very cool.
I eventually helped program managers, testers and tech writers come up with workshop plans that didn't require a computer lab and software installation. For testers, we brought some home appliances in -- mixer, hair dryer, toaster -- explained how to write a test plan, and asked the girls to execute the test plan. They explored edge cases like "what happens when you press the toast button when you're already toasting". The girls also always had fun criticizing the design and usability of the devices, which we encouraged.
For tech writers, the workshop involved writing instructions and then watching somebody follow those instructions, to see how challenging it is to write clear instructions. I brought a pile of coloured paper and asked each girl to make a paper airplane (or other origami thing) and then, on another piece of paper, explain how to make the same airplane. Then the girls exchanged instructions and tried to follow somebody else's. At the end we compared results between the original and the one made by following instructions. Here, fun was had throwing planes and decorating and naming them as well as constructing them. Some girls decorated their instruction pages too -- budding Web designers.
Finally, for program/product managers, my favourite workshop was "Design your own cell phone". I focused the discussion of features by introducing the concept of a target audience and a limited budget. What features are so important for your target audience that you just have to have them? The girls split into teams and came up with lovely ideas. Often, of course, the target audience was "teenage girls", and one group came up with a "privacy phone" that would stop your bratty brother from hearing any of your conversations or messages. But one group targeted old people and thought about what it would take to handle a user with poor eyesight and hearing. And another group targeted young kids (or really, concerned parents of young kids) and designed a brightly-coloured egg-shaped phone, wearable on a lanyard around the neck, with only three programmed "send call" buttons so that the kid could only call appropriate authority figures to arrange pick-up time, report an emergency, and so on. The girls thought that the phone should also have toy-like features so that the kid would have something to play with besides calling Mom, Dad and Emergency, so they thought there could be a couple more buttons to play ringtones or record and play back thoughts.
For six years I've thought that the kidphone would in fact make a cool product. Finally I find validation via Gizmodo: the MYMO phone for kids. I should have hired that team of girls and founded a company.
Sunday, August 15, 2004
Wednesday, August 11, 2004
I've been following calendaring standards for six years now. In that time, iCalendar has become a fairly widely accepted standard for expressing events, todos and other calendar information in a MIME text document. Many calendaring applications can import iCalendar files, or export to them, or generate an email with an invitation formatted as an iCalendar document. However, there are some problems with iCalendar's complexity, particularly in expressing recurrences, and the companion standards for conveying iCalendar files (iMIP and iTIP) have their own problems.
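To make that concrete, here's a minimal sketch of an iCalendar event with a recurrence rule, assembled as plain text in Python; the UID, timestamps and summary are invented for illustration:

```python
# A minimal iCalendar (RFC 2445) event, assembled as plain text.
# The UID, timestamps and summary are invented illustrative values.
event = "\r\n".join([
    "BEGIN:VCALENDAR",
    "VERSION:2.0",
    "PRODID:-//example//sketch//EN",
    "BEGIN:VEVENT",
    "UID:20040811-1234@example.com",
    "DTSTAMP:20040811T120000Z",
    "DTSTART:20040813T170000Z",
    "SUMMARY:Team meeting",
    "RRULE:FREQ=WEEKLY;BYDAY=FR",  # a recurrence rule -- the tricky part
    "END:VEVENT",
    "END:VCALENDAR",
])
print(event)
```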
Calendar interoperability testing has happened sporadically in these years. The first calendar interoperability event was held in 2000, a virtual event followed in 2002, and another in-person event was held this summer at Berkeley, organized by the newly revitalized CalConnect consortium. Still, interoperability hasn't improved as much as we'd like, because at some point you need to go back and fix the standard.
Also during these six years, the CalSch working group has worked and bickered over the proposal to standardize how clients access a calendar stored on a server -- this protocol, CAP, would be to calendars what IMAP is to email folders. I've never liked the design of CAP, down to the very core model of how it works. So six months ago I made a counter-proposal, CalDAV, and threw it out there as an internet-draft. Finally I'm getting more than just nibbles on CalDAV, in fact several groups have told me their intent to implement CalDAV starting this year. And at the same time, other folks are getting invigorated about revising iCalendar and bringing it to draft standard.
This is all great news. Calendaring interoperability has languished except for that burst of productivity back in 1998. People are locked into one calendar application depending on what server technology they have available, since there's no common calendar access standard. Invitations work, kinda, but in practice the problems with recurrences mean that people must follow up the machine-readable text with human-readable text in case a mistake was made between two different vendors' software.
Good news, but nowhere near done yet -- this is just the beginning. Now we need volunteers. We need people to write specification text, review text, and manage the issues. We need people simply to pay attention to the work being done and provide their experience, or simply opinions, to bring issues to resolution.
Here's where to start -- two new mailing lists, one to take iCalendar to draft standard, one to develop and standardize CalDAV. Tell your friends! Subscribe to one, subscribe to both! We've got good work to do and good people to do it but we need your help.
Monday, August 09, 2004
Like Saint Peter, I'm not concerned about what Tim Bray calls the "severe angst" that appeared in badattitude. I think when you invite a group of people together to complain and rant, they do so, attempting to be humorous and entertain each other, at the expense of reflecting the entire reality and balance of their opinions. In fact, several of us have noticed that ever since badattitude started existing during IETF plenaries, much less dissent is voiced at the microphone. Badattitude only has about 60 people and there can be 600 in the actual plenary room, so it's hard to believe that allowing 10% of the people to let off steam can directly reduce microphone usage by a noticeable amount (clearly more than 10%). Theories include:
- Uncorrelated -- people stopped complaining during open-microphone for other reasons.
- Tipping point -- the few people whose frustration was reduced through badattitude, and didn't go to the microphone, brought microphone usage below some tipping point, influencing others not to complain as well.
- The Squawker Squad -- the possibility that the people I know happen to be those most likely to complain, and by drawing them into badattitude I induced a significant percentage of the bellyachers to give their opinions elsewhere.
I was supposed to have a full transcript of the badattitude jabber room due to the automatic logging functionality of the jabber client, but nitro ate the chat log. Really -- it's gone, except for the first five minutes. Honest. Anyway, some folks logged in under their real names and might not like it published.
Friday, August 06, 2004
A Wiki is a Web site running Web application code that allows people to browse Web pages, easily add new Web pages, and easily edit any Web page through Web forms (the Web application code powers those Web forms and handles their input). Originally Wiki described the first such application but the word has spread and similar applications that run completely different code are also called Wikis.
WebDAV is an IETF Proposed Standard, RFC2518, describing a set of extensions to the HTTP protocol. These extensions allow a client application (or other software agent) to create new HTTP resources more easily, manage them, find them without following hyperlinks, and change them in a way that multiple authors can work together simultaneously.
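For a taste of those extensions, here's a sketch of one of the new methods, PROPFIND, issued with Python's standard library; the host name and path are placeholders, not a real server:

```python
import http.client

# Sketch: issue a WebDAV PROPFIND (one of the methods RFC2518 adds to
# HTTP), asking for two properties on everything one level below /docs/.
# "example.com" and "/docs/" are placeholders.
body = ('<?xml version="1.0"?>'
        '<propfind xmlns="DAV:">'
        '<prop><displayname/><getlastmodified/></prop>'
        '</propfind>')

conn = http.client.HTTPConnection("example.com")
conn.request("PROPFIND", "/docs/", body=body,
             headers={"Depth": "1", "Content-Type": "application/xml"})
response = conn.getresponse()
print(response.status)        # a WebDAV server answers 207 Multi-Status
print(response.read()[:300])  # an XML body listing each resource's properties
```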
| Wiki | WebDAV |
| --- | --- |
| Wiki uses HTTP forms to transport requests. | WebDAV extends HTTP by defining new methods. |
| Wiki is a UI. | WebDAV is completely orthogonal to a UI. |
| There are a number of different Wikis, implemented in different languages, supporting different features. | There are a lot of different WebDAV servers as well as WebDAV clients. The clients can theoretically all work against any server, although in practice there are a few outliers (special-purpose clients or servers) that don't interoperate so well with the main group. |
| Wiki is a thin-client application, where any browser can be the client because the smarts are on the server (the code running the Web forms). | WebDAV was created to allow rich clients to author Web resources. A WebDAV client is definitely the entity in control -- but you have to download or find WebDAV client software. Luckily, these clients now ship in all Windows and Mac OS's. |
| Wiki is a generic name that seems to be used if something is similar to the original Wiki. | WebDAV is a set of requirements, so a WebDAV implementation can be tested for compliance. |
| You'd probably not call something a Wiki if it didn't enable multiple users, allow easy creation of new pages, and result in a Web site collection of human-readable pages. | WebDAV can be used to synchronize with a source code repository, or to change configuration files, or to access a calendar server, or to do many more things that you would never call "shared authoring". |
| It's easy to extend (a) Wiki: just code in the new feature -- e.g. search. | It's hard to extend the WebDAV standard; it requires IETF drafts, reviews, and many revisions in the IETF doc format. It can take years -- e.g. SEARCH, begun around 1998. OTOH it's easy to extend a WebDAV server -- e.g. Xythos added anonymous permission tickets to the WFS server -- but no other clients are likely to interoperate with a custom feature like this for a while. |
Anyway, in this case, the different roles mean that Wiki and WebDAV are not at all incompatible, and to compare their advantages and disadvantages as if they are both apples can be misleading. Brad, whom I met at a party last week, realizes this: he added WebDAV support to an existing Wiki codebase. It allows another way to add pages to a Wiki or to modify them, and can interact completely smoothly with the existing WebUI. You'd think somebody would have done this before now, but I can't see that anybody has. Way to go Brad.
[UPDATE Aug 9: clarified "extend webdav" to differentiate between extending the standard, and extending a server]
Tuesday, July 13, 2004
What happened to these? There were some technical issues (performance, bandwidth, client deployment and server deployment barriers) but I don't think those were paramount. I think it was the social and infrastructure barriers combined with a simple model mismatch. First I'll explain the model mismatch.
The Web is a client-server, or request-response medium. A client requests a page, the server responds with the page, done. The connection closes. The client has the page and can display it for the user as long as needed, can cache it for reference later. In what sense is that "visiting" a Web page? How can you say that two users are "on" the same Web page at the same time? They're each viewing local copies of the same page, copies which were obtained independently and at two different times. Even if users loosely think of this as "visiting" a Web page, the protocol model doesn't support that user model very directly. So you have to extend the Web with non-Web-like logic in order to allow people to hook up on a Web page, and that's just harder due to the infrastructure.
The infrastructure around the Web makes all Web extensions more difficult. Groups like XCAP (part of the SIMPLE WG) and Atom are trying to extend the Web in rather Web-like ways and they have lots of trouble dealing with infrastructure like proxies. It's even harder to extend the Web in non-Web-like ways like shared presence on Web pages. For example, caches don't work at all with shared presence systems -- the cache returns the Web page without asking the original source server for the page, so it's impossible for the source server to know everybody looking at the page without disabling caching (which doesn't entirely work). Clients which automatically download Web pages (like syndication browsers) or download through a proxy (like anonymizers or clients behind corporate firewalls) naturally mask the user's identity and time of access.
In addition to the intermediaries, there's the end points -- clients and servers. Client software is sometimes built to allow add-ons (IE and ActiveX, Firefox and all its plugins) but most users don't use them and are trained to distrust extensions. Client-side scripting languages are implemented differently on each platform so many sites support IE only for rich interactions. Server administrators also hate installing new software or new add-ons, particularly ones which allow users interacting with each other -- partly for the scaling and social discord reasons I cover next.
So finally, if all these model, infrastructure, software and technology issues are overcome, there are the social issues.
- People protect their identities and hate cases of mistaken identity or identity deception.
- People like anonymity and being hidden viewers, when it comes to themselves.
- People resent anonymity in others and like to have an accurate idea who else can "see" them.
- People protect their own privacy -- they don't like to be seen online without warning. Many Mac users turn Rendezvous off when they're not using it.
- People don't take active steps to become visible online unless they have something they want to get out of it. Many Mac users don't turn Rendezvous on unless they are reminded of it somehow *and* can think of a good reason why to turn it on.
- Social sites are prone to spam, and people are really sick of spam and don't tolerate many new sources of spam.
- Social interactions can get nasty either intentionally or accidentally. Trolls have been around as long as there have been public newsgroups and probably longer. Long-lived chat sites may require moderators, a combination of good technical "fixes" and volunteers, or may be restricted to a small sub-community that tolerates flames, trolls and spam. Ed Krol's Hitchhiker's Guide to the Internet tried to address some of these issues back in '89.
So with all these problems and stresses, social groups online have a tendency to burn out. There have been papers and even a whole course on virtual community design, mechanics and governance.
I am guessing that in part, simple Web-page presence tools are just too lightweight to reasonably deal with (or provide the subtle cues and tools humans need to deal with) these social issues. Once you build a system that can really sustain any kind of interactive community and identity, it's gone far beyond the Web request/response architecture (even if it's loosely built on that) and involved many sophisticated features. It seems so tempting to build something simple and Web-like but human models and interactions don't simplify that well.
Monday, June 21, 2004
Monday, June 07, 2004
The media, of course, is very bad at conveying complex, nuanced system characteristics. I was recently pointed at an article with the headline For-Profit Hospitals Cost More. Cost more to whom? To the patient? To the tax-payer? One of the reasons that for-profit hospitals cost more, the article says, is that they pay taxes, so presumably the tax-payer carries some of the burden of non-profit hospitals. And why would one believe that this had any relevance for the Canadian debate? It's quite possible that both for-profit and non-profit hospitals are more effective than government-run hospitals. Not that we can agree on what 'effective' means, anyway.
The more I learn about this, the less I know. I have seen government inefficiencies in the Canadian system -- my grandparents don't get their choice of doctor, even if they don't like their doctor, because they live in a small community and they see the doctor they're told to see. They can't drive a little farther, change plans, or pay more (or differently) no matter how much they dislike their doctor. But I've seen lots of inefficiencies surrounding the American insurance system too, particularly the employer-mediated stuff and the requirements for documentation of prior insurance or pre-existing conditions. I don't know. It all makes me unhappy, and suspect that healthcare is simply an intractable, money-wasting or unfair system, no matter how you slice it.
Monday, May 24, 2004
Tuesday, May 11, 2004
Blind studies show that users can't distinguish between search results from Google, Ask Jeeves, Yahoo, and Teoma. Yet when you put a logo on the page, users show a decided preference for Google. To me, that totally debunks the idea that Google's search algorithms built on the professional-journal-references model is the key to its success. As The Wall Street Journal's Lee Gomes put it: "Some say Google is the Rolls-Royce of search; maybe what it really is, is the Nike. Googlephiles may think they are exhibiting technical sophistication by their loyalty, but what they are really proving is the extent to which they have been conditioned to respond to logos and brands, just like street kids with their sneakers." (ref)
I can't get at the blind study information, unfortunately -- Fawcette's link is to a WSJ subscription article. Anybody got a pointer for me? I'd like to figure out if it was just the logo added to the page that made the difference, not any other formatting. To me, the primary suckage of the MSN search engine was its interface (which is now, oddly, much like Google's). So it's not so surprising that slapping the same interface on results from different engines meant users couldn't distinguish them. There are other possible explanations too, besides pure brand loyalty.
Monday, May 10, 2004
Wednesday, April 28, 2004
Tuesday, April 27, 2004
However you should be careful about creating transports or bridges as it may violate the usage policies for foreign services.
Hard to tell what kind of book this is, right? It's "Instant Messaging in Java" by Iain Shigeoka which I happen to have at work.
I found this on Ted's blog and Ted references Danah Boyd... There's a certain etiquette in the blogosphere which is to credit where you got something like this. But neither Ted nor Ted's creditee invented this so I started following links because I wanted to find out "why". Ted Leung (apr 25) credited Danah Boyd (apr 18) credited Caterina (apr 11) credited David Chess (apr 11) credited long story short pier (apr 8) credited Elkin (apr 8) credited happy_potterer (apr 8) credited sternel (apr 7), who credited nobody in the post itself. The really bizarre thing is that after following this chain I looked at the comments for Sternel's post and somebody else posted a comment asking where Sternel got it from. So, onward: in the comments Sternel credits pegkerr (apr 8), credits kijjohnson (apr 7) credits mckitterick (apr 7) credits bobhowe (apr 7) credits both silvertide (apr 6) and curmudgeon (apr 6). Silvertide credits curmudgeon too. Curmudgeon credits kricker (apr 6) credits cynnerth (apr 6) and pbsage. PBSage also credits Cynnerth. Cynnerth credits seamusd whose journal I can't see. But now I can see the other self-acknowledged geek trying to track back this meme. Apparently it originated with some "find page 18, look at the fourth line" live journal post from who knows who.
Google has "about 15700" links for the search for "Find the fifth sentence" (in quotes) -- all of them blog entries with exactly this meme.
Friday, April 23, 2004
Thursday, April 22, 2004
Tuesday, April 20, 2004
- Peer-to-peer communication models, frameworks, toolkits: JXTA, Rendezvous, Howl and Jedi.
- Python libraries for WebDAV and IMAP support: Twisted and Python standard libraries
- Replication and synchronization: basic master/slave pattern, master/master pattern, RSync, RSync over HTTP, OceanStore, How to Select a Replication Protocol, the HTTP Distribution and Replication Protocol, HARP, State Replication Protocol, MoteFS, SyncML, problems in treating distributed data as if it were local (Notes on Distributed Computing), Coda (Disconnected Operations in a Distributed FS).
- How to do engineering task management: bug dbs, Project, Excel files, other...
- Good IDEs that support Python: Boa Constructor, pyEclipse, pydev, TruStudio, Wing.
- Permissions solutions: WebDAV ACLs, IMAP ACLs, LDAP ACLs, NFS ACLs; Secure Interaction Design, Capabilities v. ACLs, WebDAV tickets (capabilities), Capability theory, Protecting Information,
Inspired by Ted and the people Ted linked to, I'm also thinking of putting together a post on influential software/CS papers. Maybe tomorrow.
Friday, April 16, 2004
Friday, April 09, 2004
Monday, April 05, 2004
Spurlock had the idea for the film on Thanksgiving Day 2002, slumped on his mother's couch after eating far too much. He saw a news item about two teenage girls in New York suing McDonald's for making them obese. The company responded by saying their food was nutritious and good for people. Is that so, he wondered? To find out, he committed himself to his 30 days of Big Mac bingeing. "If there's one thing we could accomplish with the film, it is that we make people think about what they put in their mouth," he said. "So the next time you do go into a fast-food restaurant and they say, 'Would you like to upsize that?' you think about it and say, 'Maybe I won't. Maybe I'll stick with the medium this time.'"
Does he really think that every time fast food chains offer to supersize or upsize, customers agree to it? And if so, that they eat every bite? If they did, it would be no surprise if they gained 25 pounds, as Spurlock did, and had a skyrocketing cholesterol level. Note that he also limited his exercise during this period, although I would think simply eating far beyond the point where you feel full, several meals a day, would be the root cause of the bad effects he experienced. In other words, it's not the food itself but the quantity -- he ate an average of $28 worth of food each day, which (according to the price info I could find) means at least seven Big Mac value meals a day!
A different McDonald's month diet with different rules could easily have a very different result. I tend to agree with this woman who believes she can eat only at McDonald's for a month and lose weight. Her rules are pretty flexible but they definitely don't require her to super-size or eat every bite. Or to look at it another way, I suspect if I ate $28 worth of even Fresh Choice meals every day (particularly the pasta, muffins, etc) for 30 days I'd also gain weight.
Thursday, April 01, 2004
Monday, March 29, 2004
- The "I haven't used my turn signal in five years" signal. This warns me that the driver is likely to merge into my lane without warning.
- The "coffee cup in use" indicator, hooked up to the driver's side cup holder, turns on for five seconds after the cup has been removed from the holder, to warn me to watch out for violent swerves due to hot liquid spills.
- The "child in car" light turns on when the child seat is occupied (clearly this requires modified child seats with radio signals and weight detectors). Now I know if the parent is likely to weave back and forth as they reach back and try to pick up the needed toy.
- The "music volume" indicator tells me if the driver is likely to be listening to overly loud music and have their foot too heavy on the pedal -- and unable to hear me honk.
Clearly these indicators would improve public safety, and since we know there's no cost too high to save a child's life I expect these improvements will be adopted immediately.
Friday, March 26, 2004
Tuesday, March 23, 2004
There are two basic approaches to single-client multi-protocol IM support:
- Use a client that supports many protocols natively. I've used Trillian to do this on Windows systems, and was happy enough to pay the $25 for Trillian Pro so that I could run the Jabber plugin within Trillian. On a Macintosh, I've started using Fire, which is free and just works (for Jabber connections, just put the whole JID (user@example.com) in the buddy "name" field.)
- Use a client that supports Jabber natively, and a server that gateways to the other services. For example, Exodus is a focused Jabber client (it does a good job of supporting Jabber rooms which is not too common yet), and it also allows you to talk to AIM/Yahoo users through a gateway if your Jabber server has gateways.
The second problem with gateways is privacy (or security). The way AIM and Yahoo are designed means that the gateway has to connect as you -- AIM and Yahoo weren't designed to have servers involved that aren't under their direct control. So you have to tell the gateway your account and password information for it to impersonate you.
Many in the XMPP/Jabber community think gateways are the way to go in the long run, despite the usability and security problems. Here are a few reasons:
- It's a lot of work to write a client that supports so many protocols. Coding/design effort has to go into handling all the protocols, rather than into a good user experience. The client may not be free after so much work -- take Trillian Pro, for example. But even if you're paying for Trillian Pro, its developers/designers decided not to handle all of the features of all of the systems Trillian connects to. For example, Yahoo supports an "invisible" mode, but even though Trillian connects directly to Yahoo using your Yahoo account, it won't set you to "invisible".
- It's a lot of user work to maintain two to six different accounts (which means identities) and juggle them all. What happens when I use AOL, Yahoo and Jabber with Fire, and I want to connect to a friend who uses AOL, MSN and Jabber with Trillian? Do we use AOL or Jabber? Which one works best? Do I have two listings for my friend in case one is unavailable?
- The protocols used by AOL and Yahoo IM change frequently. Every time that happens, Trillian and Fire users have to upgrade their client software. It's less of a burden overall to upgrade a gateway that serves hundreds or thousands of users than to upgrade all those users' clients.
- In the long run, the hope is that all IM systems will accept that IM is an Internet-wide system, not controlled by any one company. When AOL, Microsoft and Yahoo accept this, they'll start to migrate their systems towards the IETF standards that have already been put in place for IM addressing and for standard presence information formats. Then gateways will work a lot better, without so many hacks for address translation and presence information translation.
John's tips for gatewaying with Exodus
John just got gatewaying working with Exodus and documented his steps.
- First, find a Jabber server that supports gatewaying. Not all of them do -- e.g. "jabber.org" doesn't do gatewaying. If you've already got a Jabber ID on a non-gatewaying server, you can keep it there and use a different server for gatewaying, but John doesn't recommend that -- there will be some availability hiccups as you now depend on two servers being up, not just one.
- Next, register with each gateway service, a process that involves supplying your identity and password for that service and binding them to your Jabber ID.
- Finally, to create a new buddy through a gateway, you have to put the transport type ("AIM" or whatever) in the "Contact type" field, the handle or screen name (only) in the "Contact ID" field, and the server name (in foo.bar form, not aim.foo.bar) into the "Gateway server" field. That results in a Jabber ID of ContactID@aim.foo.bar, but this shouldn't be a user issue.
Monday, March 22, 2004
Wednesday, March 17, 2004

Wednesday, February 18, 2004
...fields, when ordered by size, appear to yield an asymptote which is interpreted as evidence of an approaching limit... in fact, the asymptote appears to be nothing more than a statistical artifact--that is, use of a large population, ordered by size, will frequently yield an exponential curve with an apparent asymptote.
What I think this means: if Campbell had done the same analysis on another set of numbers with a similar distribution, such as the number of bloggers that link to each blogger (the power curve), he'd have seen an approaching limit on that data set too. The key is the distribution - a few large discoveries (or popular bloggers) and many small discoveries (or ordinary bloggers).
It goes on - the Lynch study is quite readable and entertaining particularly if you visualize the blood on the floor. Figure 6 is great, showing how widely an actual production curve diverges from the doomsayers' predictions.
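The artifact is easy to reproduce, if I understand it right: sample "discovery" sizes from any heavy-tailed distribution, order them largest-first, and the cumulative curve flattens as if approaching a limit. A quick Python sketch, where the Pareto distribution, its parameter and the sample size are all arbitrary choices:

```python
import random

# Draw "discovery" sizes from a heavy-tailed (Pareto) distribution, order
# them largest-first, and watch the cumulative total flatten as if it were
# approaching a limit -- even though nothing is running out.
# alpha=1.1 and n=5000 are arbitrary choices for illustration.
random.seed(1)
sizes = sorted((random.paretovariate(1.1) for _ in range(5000)), reverse=True)

total = 0.0
for rank, size in enumerate(sizes, start=1):
    total += size
    if rank in (10, 100, 1000, 5000):
        print(f"largest {rank:>4} discoveries: cumulative size {total:12,.0f}")
```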
The other contrarian article simply debunks the common meme that TV and movies make kids more violent. Some miscellaneous evidence:
- "SUNY-Albany sociologist Steven Messner has found that metropolitan areas in which violent TV programs attract especially large audiences have lower rates of violent crime."
- "the homicide rate barely changed from 1945 to 1967; the big increase started in the late 1960s, suggesting that something other than TV was at work"
- "The murder rate remained constant in Italy and declined in France, Germany, and Japan following the introduction of TV.
Monday, February 09, 2004
Economists should not be surprised that individuals underestimate the effect of inflation on the demand for their own services. One of the most significant differences between trained economists and the lay public is economists' greater appreciation of general equilibrium. The cognitive difficulty of general equilibrium has been indicated by the fact, noted by the Commission on Graduate Education, that even economics graduate students do not give the correct explanation for why barbers' wages, in the technically-stagnant hair-cutting industry, have risen over the past century [Krueger, 1991, p. 1044]. If economics graduate students fail to appreciate the effects on barbers' opportunity costs from wage increases due to productivity change outside the hair-cutting industry, it would be a stretch to expect the lay public to see that as inflation rises the demand for their services (in nominal dollars) will similarly rise with it.

That's from an Akerlof paper on appropriate levels of inflation.
Friday, February 06, 2004
- Getting cold can give you a cold
- We have less free time than we used to
- American families need two incomes
- Money can buy happiness
- Republicans shrink the government
- The rich don't pay their fair share of taxes
- Chemicals are killing us
- Guns are bad
- We're drowning in garbage
Except for money buying happiness, and (depending on your beliefs) Republicans shrinking government, it's a good thing that all these myths are false. Why do we believe myths that make us also believe life is worse than it really is?
Sunday, February 01, 2004
Thursday, January 22, 2004
But I can't believe Kling's second hypothesis, that a fear of math is at fault here. I've known people with a University degree in Math (also Computer "Science", also the hard sciences) to have economo-phobia. Kling's sample must be too small for his hypothesis.
There are other possible models to explain economo-phobia, assuming of course it's a real phenomenon (Kling and I could be deluded, after all...).
- Pre-scientific models are replaced by generational churn, more than by rational thought. (For that matter, scientific revolutions also happen only at the pace of a generation). International trade in labour is too new to understand.
- The data that falsifies common-sense models is too hard to see. We can't directly link a drop in the price of the new pants we buy to a friend losing an IT job and deciding to go back to school.
- Common-sense models lead to plans like rent control. When these don't work, we're notoriously bad at throwing out the models -- in fact we may become emotionally attached to them, rather than go through the unpleasant process of admitting rent control doesn't really work.
- Economics uses the same words or terms as moralists, only perverting those terms from what a moralist understands. For example, "correct" outcomes and "value". Not to mention, a "good". It's difficult for armchair moralists to accept that terminology.
- Economists must overcome a certain "gag reflex" when putting a value on certain things, such as a human life.
- Economists seem to oppose certain moral-seeming actions. Taking the example of rent control again, surely it's a proof of moral upstanding to want poor people to afford decent living quarters. When an economist opposes a plan like rent control, she seems to oppose the whole idea of providing poor people with decent living quarters.
If common sense and morality are arrayed against economics (even if economics isn't against them), it's not too surprising that we find an understanding and acceptance of economics so rare.
Wednesday, January 14, 2004
Here's my own geek knitter cred: I actually knit Fibonacci-striped socks (described in more detail on my knit stuff gallery). I also want to do a cellular automata lace shawl, and you'll understand why when you see Debbie New's beautiful cellular automata sweater.
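In case the sock pattern sounds mysterious: the stripe heights just follow the Fibonacci sequence. A toy sketch of charting the stripes in Python -- the colors are invented, and real row counts would depend on your gauge:

```python
# Toy sketch: stripe heights (in rows) follow the Fibonacci sequence,
# cycling through yarn colors. Color names are invented; real row counts
# depend on gauge and how long you want the sock leg.
def fibonacci(n):
    a, b = 1, 1
    for _ in range(n):
        yield a
        a, b = b, a + b

colors = ["teal", "cream", "charcoal"]
for i, rows in enumerate(fibonacci(7)):
    print(f"stripe {i + 1}: {rows:2d} rows of {colors[i % len(colors)]}")
```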
Tuesday, January 06, 2004
Such are the marks of intelligence in the people of these countries, which are very sparsely populated, especially those of the Soriquois and Etechemins, which are near the sea, although Membertou assures us that in his youth he has seen chimonuts, that is to say, Savages, as thickly planted there as the hairs upon his head. It is maintained that they have thus diminished since the French have began to frequent their country; for, since then they do nothing all summer but eat; and the result is that, adopting an entirely different custom and thus breeding new diseases, they pay for their indulgence during the autumn and winter by pleurisy, quinsy and dysentery, which kill them off. During this year alone sixty have died at Cape de la Hève, which is the greater part of those who lived there; yet not one of all M. de Potrincourt's little colony has even been sick, notwithstanding all the privations they have suffered; which has caused the Savages to apprehend that God protects and defends us as his favorite and well-beloved people.
This is the kind of evidence that now leads many to believe that Europeans killed off "savages" through disease vectors. Guns, Germs and Steel, a good book, was where I first read a bunch of stuff about this theory, but it's not the only place you see it nowadays. The letter quoted here was sent in 1611, one hundred years after the French believed they had laid claim to the area. Four or five generations after first claim, there were still "Savages" falling ill regularly. They weren't "breeding new diseases", of course.
Also in the original French:
Voylà les marques de l'esprit de cette nation, qui est fort peu peuplée, principalement les Soriquois et Etechemins qui avoysinent la mer, combien, que Membertou assure qu'en sa jeunesse il a veu chimonuts, c'est-à-dire des Sauvages aussi dru semés que les cheveux de la teste. On tient qu'ils sont ainsi diminués depuis que les François ont commencé à y hanter: car, depuis ce temps-là, ils ne font tout l'esté que manger; d'où vient que, prenant une tout autre habitude, et amassant de humeurs, l'automne et l'hyver ils payent leurs intemperies par pleurésies, esquinances, flux de sang, qui les font mourir. Seulement cette année, soixante en sont morts au Cap de la Hève, qui est la plus grande partie de ce qu'ils y estoient; et neantmoins personne du petit peuple de M. de Potrincourt n'a esté seulement malade, nonobstant toute l'indigence qu'ils ont paty; ce qui a faict apprehender les Sauvages que Dieu nous deffend et protége comme son peuple particulier et bien-aymé.
Wednesday, December 31, 2003
to play at on the heads of their drums, being only ruled by hazard, and subject to knavish cogging; and as for the chesse, I think it over-fond, because it is over-wise and philosophicke a folly." -- Henry VIII, quoted in The sports and pastimes of the people of England, A new edition, with a copious index, by William Hone.
Thursday, December 18, 2003
That have been so bedazzled with Sun
(apologies to Shakespeare).
OK, well it's certainly not the first time Sun has been perceived as an enemy of open source, and apparently there is good reason:
Sun should have owned Linux and should have owned the community. It is Unix, and all Unix developers should have been Sun developers with Linux.

This is from Ed Zander, who just left Sun after being second to Scott McNealy for 5 years (ref).
Thursday, December 11, 2003
Saturday, December 06, 2003
Wednesday, December 03, 2003
Monday, December 01, 2003
- The WTO can act quickly (15 months, I think) in its rulings
- The rules are effective, if even Bush must cave
- The rules are principled, because Bush's argument for the steel tariffs was based on convenience. The excuse for the tariffs was that they were only temporary. When you argue for such a weak exception to a rule, that can reinforce the principle behind the rule.
If Bush had never put the temporary tariffs in place, there wouldn't have been this clear an opportunity for the WTO to strengthen its hand.
Seablogger has a good post on the event, including notes on the difficulty of agreeing that Kyoto is good even if we all agree that global warming exists.
There are plenty of reasons people are proposing these new commercial ventures, however. One proposed wind farm in West Virginia would cost $300,000,000 to build, but would recover those costs and then some through various tax shelters and subsidies equaling $325,434,600. In many cases, the profit from this government largesse exceeds the income generated from electricity sales. Wind farm owners enjoy windfall profits at taxpayer expense. Green is very attractive when there are greenbacks involved...

The article overall asks whether wind farms are in fact an environmental win over gas consumption. It's a difficult issue, and often involves comparing apples to oranges (noise and bird deaths to chemical pollution, for example). But it certainly clouds the issues to have such large tax-payer-backed government subsidies. Just think how much that $325 million in subsidies could have done if it had gone directly to protecting wildlife or unspoiled areas.
Thursday, November 27, 2003
Tuesday, November 18, 2003
Microsoft security architect Carl Ellison said that one popular culture problem is that security just isn't considered cool. (ref)
That's not borne out by the security experts I meet outside Microsoft, who revel in their elite status among geeks. On sites like Slashdot, security is very cool - "black hats" and "white hats" find and identify bugs for the coolness factor. There are also many, many books on security, including some really bad books - an indicator that the publishers are so hungry for books on the topic that they'll publish garbage and people will still buy it.
Wednesday, November 12, 2003
San Mateo's putting a wireless umbrella over the whole California county so that the cops don't have to come back to the police station to synchronize. They say they were spending as much as three hours per day driving back and forth to the police station. It completely changes things. (ref)

I'm sure that out of those three hours gained, one is lost again browsing the Web and another is lost again IMing with buddies, but that's still a significant productivity gain.
Tuesday, November 04, 2003
I'd like to take the contrarian view that this isn't true but I'm not sure. I first argued weakly that perhaps we weren't really getting desensitized, that she had a sample size of one looking at herself and her reactions to movies and TV news. But I didn't really believe that argument myself - it seems obvious that if you see all of the Terminator movies you're not going to be shocked by Total Recall. My next ineffective argument was that perhaps video violence serves a useful purpose - that we're less likely to panic the first time we see somebody get in a car accident in real life, and we can respond more effectively. But that argument is actually the opposite of my first argument, assuming that desensitization does happen. Apparently studies have been done but I haven't seen anything particularly convincing either way.
It does occur to me that perhaps desensitization isn't the problem we think it is.
- How broad an effect does desensitization have? If you can watch Lethal Weapon and cheer when Riggs kicks the drug dealer in the nuts and breaks his arm (scene 19C), does this mean you'll cheer when you see somebody do that in the street? Not bloody likely (pun intended). Without the setup of movies, seeing this for real would be terribly upsetting, shocking, and not likely to make you cheer. So maybe desensitization is not that transferable.
- What effect does desensitization have? Sure, we don't feel so much like vomiting after years of watching gut-churningly violent movies. But does it necessarily make us approve? Or does a movie like The Gift make us consciously condemn the wife-beater even though that role is played by Keanu Reeves?
- How does violent entertainment affect us differently? Surely there are many factors that change how violent entertainment affects us, such as age, upbringing, gender and personality. So I've watched a bunch of Jackie Chan movies, and now I take karate class and love it. I don't think I'm more likely to hurt somebody else now though, except in self-defence, and maybe not even then. Is it only a minute fraction of the population that is more likely to commit violence through exposure to violent entertainment?
- Eric Rescorla for everything
- Jim Whitehead and Greg Stein for getting me started, reviewing chapters and writing blurbs
- Clay Shirky and Larry Masinter for writing more blurbs
- Radia Perlman for encouragement and having me in her series
- Brian Korver, Elias Sinderson, Andrew Sieja, Yaron Goland, Geoff Clemm, Terence Spies, Kevin Dick, Gary Gershon and Andrew McGregor for technical reviews and advice
- Peter Raymond and Rick Rupp for diagram ideas, Keith Ito for his tracer tool
- Everybody on the various WebDAV mailing lists
- Rachel Gollub and Rob Alvelais for friendship and support
- Prentice Hall editors and production staff: Mary Franz, Noreen Regina and Anne Garcia and Techne Group's Dmitri Korzh (for their patience)
- My mom and dad
My Amazon sales rank is 863,536 so far. Whatever that means.