Wednesday, November 29, 2006

Not dead yet!

Things have been a bit screwy around here lately; therefore, I haven't had time to finish up the post I'm working on right now. Maybe in a few more days.

At least I finished A Clash of Kings.

Friday, November 24, 2006

Lots of stuff, all mixed together!

I recently commented on the idea of automated page expirations. Someone mentioned the oft-promised dewiki stable versions proposal with respect to this. Almost immediately after hearing about stable versions I realized how powerful and useful they would be for Wikipedia. They were promised for delivery "soon"; the timetable was presented as sometime in October, maybe even earlier, right after single-user login, which was supposed to be available "very soon" in August. Well, it's November, almost December, and single-user login isn't here yet, and neither are stable versions. My patience is starting to wear thin....

Another commenter suggested that Wikipedia has too few editors to have "content administrators". Wikipedia has thousands of editors. I find it hard to believe that there aren't enough people in those thousands who would be willing to do that sort of work if asked. Wikipedia does a poor job of managing its volunteers, and especially of advertising ways that volunteers can help the project. There is some sort of weird belief that it's wrong to ask people to do things for the project, or expect people to hold to their promises to do things, once made. Are Wikipedians afraid of commitment? (Is this a consequence of the relatively young age of most Wikipedians?)

The discussion of this brought to mind another idea, one Greg Maxwell had: have a bot go through the database, marking (at a relatively slow rate) pages which lack sourcing with a tag that declares that the article is not sourced and will be deleted if not sourced within one month. Any page which, after that month, is still unsourced is then deleted. The tagging rate has to be selected to balance the desire to get all articles sourced in a reasonable time (there are something like 250,000 articles which lack sourcing on the English Wikipedia) while at the same time not putting too much demand on the editors who know how to properly source material. Pretty much everyone I've talked to about this thinks it's a good idea. I believe the only reason Wikipedia is not currently doing this is the lack of a working enwiki replica on the toolserver (another issue the developers seem not to care about very much).
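For a sense of the rate trade-off, here's a back-of-the-envelope sketch (the rates are purely illustrative, not anything actually proposed):

```python
# How long would it take to work through the unsourced-article backlog
# at a given daily tagging rate? Once the campaign is in steady state,
# the tagging rate is also the number of month-old tags coming due for
# review (and possible deletion) each day.
BACKLOG = 250_000  # rough count of unsourced enwiki articles

def campaign_days(tags_per_day):
    """Days needed to tag the entire backlog at the given rate."""
    return BACKLOG / tags_per_day

for rate in (500, 1000, 2500):
    print(f"{rate:>5} tags/day -> {campaign_days(rate):6.0f} days to tag "
          f"everything; {rate} articles/day hitting the one-month deadline")
```

Tag faster and you finish sooner, but you also dump more deadline-expiring articles per day on the editors who know how to source things properly.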

The Esperanza dispute that I wrote about was deftly deconflagrated by Kim Bruning, who quite wisely recognized that no consensus could possibly arise from that train wreck of an MfD and closed it early. I understand that he was criticized for doing so by a few sore losers. Kim's a tough cookie, though; he can take the heat. I can only hope that in the long run some good comes out of the secondary discussion.

I've had several people ask me recently if they should run for ArbCom. Why are people asking me this? Maybe they should listen to what I had to say in Episode Five of Wikipedia Weekly about the elections and people who think they might want to run for Arbitrator. Still, I'm not happy about the current slate of candidates: of the thirty-one candidates who have declared so far, I really think that only two or three are qualified and suitable. There's maybe a half dozen more who I don't know well enough to know whether they're suitable or not; the rest are, in my eyes, unsuitable in some way or another, some very much so. There's five seats open, which means that the chances of electing an unqualified candidate are quite good. And while frankly I think the ArbCom is losing relevance, it still remains an important entity; it would certainly not be a good thing for the ArbCom to become home to one, or worse yet, several, people who have no business sitting in judgment on their fellow editors. (By the way, the people who've asked me have generally gotten back a generalized "meh". Nobody who I really think belongs on ArbCom has asked me if they should run. Fortunately, nobody who I really think shouldn't be on ArbCom has either: I haven't had the displeasure of telling someone that they really shouldn't run.)

On a related topic, Werdna mentioned to me that he's working on an essay on how to improve Wikipedia. He's pretty much spot-on about a lot of the issues. I'm looking forward to seeing what his solutions are, beyond exhorting readers to behave differently.

On to another topic entirely: in last week's Wikipedia Weekly we briefly talked about universal wiki markup languages. Some listener mentioned Wiki Creole to Andrew, who shared the link with me. This project is working on developing a subset of wiki markup that would be supported across multiple platforms and allow editors moving from wiki to wiki to use the same basic markup set no matter where they are. A MediaWiki implementation is promised. I noticed they've not made the MediaWiki mistake of using quotes for both bold and italics, but they still have the equals problem (both of which introduce unlimited lookahead, although in the equals case it's at least bounded to the end of the line instead of the end of the document, and the equals problem can be avoided if you have reasonable error recovery semantics, which MediaWiki doesn't).
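To make the equals problem concrete: a parser can't commit to a heading level until it has scanned to the end of the line, because the level depends on the trailing run of equals signs as well as the leading one. A toy Python sketch of that bounded lookahead (this illustrates the parsing issue; it is not MediaWiki's or Creole's actual grammar):

```python
import re

def parse_heading(line):
    """Toy heading parser. The heading level can't be decided from the
    leading '=' run alone; we must look ahead to the end of the line and
    count the trailing run too -- lookahead bounded by end-of-line."""
    m = re.match(r"^(=+)(.*?)(=+)\s*$", line)
    if not m:
        return None  # not a heading at all
    # Take the smaller of the two '=' runs as the heading level.
    level = min(len(m.group(1)), len(m.group(3)))
    return level, m.group(2).strip()
```

Quote markup is worse: whether a run of apostrophes opens italics, bold, or both can depend on text arbitrarily far away, so there the lookahead is bounded only by the end of the document.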

Happy Thanksgiving, everyone (in the US, at least). Don't forget to check out next week's Wikipedia Weekly on Monday!

Tuesday, November 21, 2006

Of WikiCouncils

My gnomes have mentioned to me that there has been, since the October board retreat, some discussion about some sort of "WikiCouncil", consisting of respected members of the various communities, brought together to discuss issues and advise the Board. The exact details of the proposal vary depending on who makes it, and from moment to moment, but in general there seems to be some significant support for this amongst most of the vested parties at the Foundation level, with one notable exception: the Board's newest member, Erik.

I have long supported having a council of this sort. A council of Wikimedians, chosen by the communities to represent them, and wielding at least some influence over Foundation matters, would go a long way to restoring the sense that the community has real input into the operation of the Foundation—something which has, of late, been largely lacking. In the long term and in the ideal case, I actually think that an assembly of delegates elected from the various project communities (with the number of delegates from each community decided by some fair and agreeable method and the method of election left largely up to each community) replacing the Board as the ultimate controlling authority would be the best model for long-term governance of the Foundation. (This assembly would meet once a year, immediately before, during, or immediately after Wikimania, and elect a Board, which would then govern between these annual meetings.) The Foundation is probably not ready at this point in its existence to move to such a governance model, however, and I do not today advocate it. However, efforts to establish a possible precursor to such an entity would be a good thing, in my eyes, not only for the potential it would offer for a long-term move toward true community governance of the Foundation, but also for the shorter-term value of better community input into the strategic vision of the Foundation under today's governance (which quite frankly has almost no obligation to the community at all).

I freely admit that I have not read any of the discussion surrounding this concept on the Wikimedia public mailing lists; those lists are full of sound and fury, signifying almost entirely nothing. In almost every discussion on a public Wikimedia mailing list, the discussion is dominated by the noisy fringe, and in general the reasonable people bow out of the discussion because they do not wish to argue pointlessly with these fringe elements, who in many cases border on trolls. Nor have I read any of the private discussions—even when I was still involved in Wikimedia activities I only had access to the ComCom's mailing list, and certainly never to the vaunted internal-l or private-l, which is supposedly where all the real dirty business takes place. The Wikimedia Foundation has long been the home to some very ugly turf games and internal politicking, and I see no reason why that would have gotten any better with Erik's election; in fact, I would expect that it has gotten much worse. (I cannot begin to imagine how ugly the politics of the executive search and the discussion regarding Tim Shell's replacement must be.)

What I find interesting, though, is Erik's strident opposition (and I characterize it as "strident" based on what my gnomes are telling me; I have no direct experience with it myself, as I generally avoid talking to Erik, for reasons I may share at a later date, and as mentioned I have not read the mailing lists). I can only assume that this is because Erik feels that such a council would diminish his own importance. Sadly, this should not be a factor that a responsible member of the Board of Trustees of a nonprofit organization would use in decisionmaking for that organization, but I have seen decisions made for the Foundation in the past with equally questionable motives, as well, so I shall not waste too much time applying the tar and feathers to the Board's newest member. But I most definitely do call into question his motives for opposing what is quite clearly a very sensible means of establishing a viable channel for community feedback into the Foundation, something which both the Foundation and the communities need, in order to ensure that the Foundation is actually serving the communities' interests and that the communities feel that they have some real stake in the operation of the Foundation. It is my understanding that virtually everyone at the board retreat—save Erik—supported some sort of council. And yet, now that the retreat is over, Erik is building a grassroots campaign against what would normally appear to be the sort of thing the grassroots would want. (Is this grassroots, or astroturf? Hm.....)

Anyway, very curious situation. Makes me glad that I'm just an observer, not a stakeholder.

Monday, November 20, 2006

How to make MediaWiki better

MediaWiki, at least when used to support Wikipedia, could use at least two additional features.

The first is a workflow system. Right now, there are four outcomes that can occur when someone goes to edit an article. First, the user may be refused permission to edit at all. Second, the edit may be accepted and stored. Third, the edit may be rejected due to an edit conflict. Fourth, the edit may be rejected because it contains content deemed inappropriate (e.g. links to a "bad image" or to a spamblocked URL). I would like the option to add a fifth: defer the edit until it has been reviewed by a content administrator. (Edits made by a content administrator would possibly bypass these checks.) This could be used to deal with editors who are on probation, or for edits which insert certain URLs which are frequently, but not always, spam (e.g. youtube, myspace). Right now, Wikipedia cannot add YouTube to the spam link list because doing so makes editing [[YouTube]] impossible. This would allow people to edit articles containing questionable content, subject to a review process. (A generalized improvement of the methodology by which certain content is deemed unacceptable would be useful too.)

The other addition would be the ability to add an expiration date to an article. This would be useful when an article contains content that is known to go out of date at some future point. Upon expiration, the article generates a work queue item. Users with privileges on a work queue will be presented (upon request) with items on the work queues they have access to and asked to resolve them; in this case, by updating the page. This could also be used to handle, e.g., AfD closures, RfA closures, proposed deletion, and lots of other things. I believe there's already a tasks extension, which is not enabled on the English Wikipedia for reasons that are not known to me, but I think this goes somewhat beyond what that extension offers.
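A minimal sketch of how such a queue might work (all names are hypothetical; this illustrates the idea, not the tasks extension or any real MediaWiki code):

```python
import heapq
from datetime import datetime, timedelta

class WorkQueue:
    """Pages carry an expiration date; once it passes, the page surfaces
    as a work item that privileged users can draw from on request."""

    def __init__(self):
        self._heap = []  # (expires_at, page_title, note), soonest first

    def schedule(self, page_title, expires_at, note=""):
        heapq.heappush(self._heap, (expires_at, page_title, note))

    def due_items(self, now):
        """Pop and return every item whose expiration has passed."""
        due = []
        while self._heap and self._heap[0][0] <= now:
            due.append(heapq.heappop(self._heap))
        return due
```

The same structure handles the other cases: an AfD gets scheduled to expire five days out, a prod one week out, and closing them becomes a matter of servicing the queue rather than someone remembering to check a category.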

I'm starting to think that I'd be better off reengineering MediaWiki rather than continuing the port, keeping these (and other) considerations in mind. MediaWiki's design is rather wedged, and I don't see any real value in continuing off of that design.

Friday, November 17, 2006

Wikimedia needs a TechCom

One of the things I've noticed in the past year and a half or so of watching Wikimedia operate is the way that MediaWiki development (which is at least partially paid for by the Foundation) seems to lurch about almost at random, with development being driven as much by what the devs feel like doing as by what Wikimedia needs. Now, I realize that MediaWiki has customers other than Wikimedia. However, Wikimedia is the only customer that is paying them; it should get a much larger say in what gets developed.

On top of that, there is relatively poor communication between the communities (who have the desire for technical changes) and the developers (who are in a position to implement changes). Community-driven changes only take place when someone in the community manages to find a developer and convince that developer that the change is a good idea. This forces developers to be judges of consensus in communities that they are likely not even members of. There is no established mechanism for a community to come to a consensus regarding a change which requires either technical assistance (a change to a configuration setting of the software) or software development, and, having come to that consensus, then request that the developers make that change. This isn't to say that such changes don't happen; it's just that there is no established mechanism. Whether or not a community can get a change executed comes down to whether or not it can successfully convince a developer that the change is worth doing, which is a battle entirely independent of winning the consensus of the community for the change.

Another problem is the fact that many developers do double duty as system administrators. As a former developer turned system administrator, I will testify that this is one of the worst possible ways to run a development operation. There are endless reasons for this; I am not going to get into all of them, nor do I suggest that Wikimedia's operations team is guilty of all of them. However, I am a strong believer in a clear separation of responsibility between developers and administrators, especially on production hardware. (To be fair, Brion has done a decently good job of managing this, although there have certainly been some very egregious exceptions.)

On top of that, Brion is currently managed by Brad. Brad, while more technical than most CEOs I've run across, is neither sufficiently technical to direct Brion effectively, nor does he have the time to do it on top of all the other stuff he has to do. Nor is there any guarantee that the permanent ED will be even as technical as Brad. Brion is not sufficiently skilled (or, more pertinently, experienced) a CTO to effectively keep Brad appropriately informed on technical matters or, from what I've seen, to manage Wikimedia's technical assets and contracts in a fiscally prudent manner. As a result, the Foundation wastes money on poorly-considered purchases and contracts (and, especially, on its strategy of "throwing hardware at the problem" of its grossly inefficient software), and doesn't get a whole lot of value out of funding MediaWiki development. It's clear to me that Wikimedia needs to shake up its technical side somewhat.

My recommendation is threefold. First, appoint a true CTO: someone who has the technical skills to manage both developers and operations, without actually having to be either a developer or a system administrator, and also the managerial skills to interface effectively with the nontechnical people in the ED's office, the CFO's office, and the Board. The role of the CTO (a title which Brion currently holds, but he does not really perform the duties of the office) is to direct operations, infrastructure investment, and development to ensure that the goals of the Foundation are being met by those activities, to keep the leadership of the Foundation informed on technical developments in a manner that is comprehensible to them, and to ensure that the directives that are set by the leadership are met in a timely manner by the technical staff and volunteers who report to the CTO.

Second, appoint a Technical Committee (or TechCom). The purpose of the TechCom, which would be a committee operating under the auspices of the Board, is to determine the technical needs of the Foundation and of the communities and convert those into directives to be given to the CTO for implementation. They would do so in consultation with the Board, with representatives of the communities, and with the CTO and other technical personnel. The CTO would probably be ex officio a member of the TechCom. The TechCom would be the entity to establish the mechanism by which a project requests a technical change; once the TechCom has evaluated the request and prioritized it, the CTO then decides how to make the request happen and assigns it to the appropriate teams for implementation.

Third, separate the technical staff into development and operations teams. The development team, led by a Senior Developer (a role for which Brion is probably most appropriate), would develop MediaWiki and other software required to meet the objectives and directives determined by the TechCom and the Board, and would report up to the CTO. The operations team, led by the Director of Technical Operations, would be responsible for maintaining the servers, hosting arrangements, and other such things as are required to maintain the day to day technical operation of one of the Internet's more complicated sites.

The Senior Developer will likely have to do a lot of volunteer coordination, since most of the MediaWiki developers are volunteers. However, it would likely make sense to allocate some budget to contracting out development of code and/or hiring programmers, especially where such development could increase the efficiency of the systems used by Wikimedia. The current management strategy gives the developers no real incentive to improve efficiency, because they have control over both operations and the hardware acquisition budget; they can simply solve performance problems by throwing more hardware at them. This has resulted in the Foundation being significantly overinvested in server hardware. On top of that, having Brion doing so many different tasks prevents him from doing any of them as well as he could. Divesting him of his operational and hardware acquisition responsibilities would free him to actually develop the code, as well as give him time to recruit and manage volunteers for the project, which hopefully would lead to a better completion rate on outstanding projects. We've been promised SUL for, what, over a year now, and stable versions on dewiki is now overdue as well. I cannot help but imagine that this is in part due to Brion being stretched too thin; but I also suspect it is due to inadequate supervision of Brion and the other technical resources as well.

Tuesday, November 14, 2006

The ethics of editing encyclopedias, and of deleting social groups

This just in, from Wikimedia Commons: apparently having ethics, and expecting others to have them too, is contrary to Wikipedia's NPOV policy. I've heard some real doozies in my time hanging around Wikipedia, but this one takes the cake. I sincerely hope that rklawton retracts this outrageous statement with all due haste; if he fails to do so he really should be disinvited from the project. The idea that encyclopedia editors are not only not bound by ethical constraints, but must not be so bound, is flatly absurd.

Meanwhile, there is apparently a proposal to delete the Esperanza project on Wikipedia. I commented on Esperanza earlier, identifying it as one of the mechanisms by which nongeeks manipulate social networks, and explaining why it is reviled by geeks. Well, against my advice, the geeks have moved to delete Esperanza's pages out of the project space. The deletion attempt will almost certainly fail, and furthermore it will further polarize the conflict between the geeks and the nongeeks. I have been encouraging people to quit Esperanza, even denounce it, but to move for its forcible deletion is inherently a violent act (in Meatball terms, a "DirectAction") and as such will tend to polarize the community instead of leading it toward consensus.

There was no imminent need to delete Esperanza. Whatever damage it was doing was being done slowly, and could have been addressed by a more deliberate discussion, initiated in a less confrontational manner than the heated, high-pressure environment that a deletion discussion necessarily forces. In general, this is always the problem with deletion discussions on Wikipedia: they presume the conclusion. Instead of discussing generally the merits of the content in question, with many possible outcomes, the discussion is forcibly channeled into an up-or-down decision on deletion, with a fixed instead of open timeframe for discussion. Both of these structural features work against development of a true consensus, and instead turn the deletion process into trench warfare, which is what the deletion "discussion" of the Esperanza project is clearly turning into. I would have far preferred an open discussion on the merits of Esperanza as an encyclopedic organization, hopefully leading, at its conclusion, to a consensus decision to disband it, or at least reform it to reduce some of its more odious qualities. But I fear that the current deletion demand will merely galvanize the Esperanzan diehards into defending their plaything, and building up its bulwarks to defend against further such actions.

Monday, November 13, 2006

Playstations, price gouging, and anticompetitiveness

So, Sony, manufacturer of the PlayStation 3, which is being released in time for Christmas, is apparently offering a special deal: pay extra for the box plus a bundle of games and we'll guarantee delivery by Christmas. This sounds like a neat deal to me, and perfectly reasonable: the consumer pays a premium for guaranteed delivery of a desired good. Consumers who don't want to pay the premium can compete with everyone else on the open market for the residual inventory. Because of market competition, of course, they might end up paying more than they would have with the premium, but that's the risk you take with such deals.

So why are people complaining that this is price gouging? First of all, price gouging is not really possible for luxury items like Playstations. Nobody is forced to buy a Playstation; if merchandisers refuse to sell Playstations at prices that the market will bear, they will not sell any and they will not make money. This is not like charging $6 a gallon for gas immediately after 9/11. On top of this, merchants aren't jacking up the basic retail price; they're simply offering consumers the option to pay a premium to avoid the risk of shortage.

The objections being raised to this (see the comments at the Techdirt article) are just silly. Some of them are people who seem to think that they have some sort of right to a PlayStation 3 (well, guess what, you don't). Others are quite likely speculators who are moaning over how this is going to take a big whack at their plans to buy up as many PS3s as they can and then sell them at massive markup on eBay in late December. And, of course, I'm sure the other game manufacturers are less than thrilled; consumers who elect to accept the premium offer will likely buy fewer of the competition's products. But price gouging? Don't be so freaking histrionic. It's just a game.

Random stuff from other blogs....

First, here we have a proposal to recreate Nupedia. Hey, might work, now that Wikipedia has critical mass, but it'll take a very long time to actually create an encyclopedia with only seven people... Still, the fundamental complaint is failure to maintain neutrality. Maybe someone should point this guy at Citizendium?

Here, we have another editor who claims he has been chased off over neutrality issues. Seems none too happy about it, either, and judging by his response in the comments probably deserved what he got. Still, it's a negative outcome; someone should do an RCA and find out why he's so pissed.

Meanwhile, Joystiq writes about vandalism to the Nintendo article.... Yeah, vandalism happens. Maybe someday Wikipedia will implement that "stable version" thing they were talking about at Wikimania. That would be nice. Wasn't that promised by like end of September? Oh, and by the way, just what happened to single user signon?

Saturday, November 11, 2006


I've added a bunch of new colors to my cooperative color naming system. I'm working on a way to group the color list by color (the database was recently sorted alphabetically) to make it more interesting to look at. I also need to convert this to something more AJAXic so I can store the colors in a database on my server and fetch and save them dynamically instead of the rather dumb way it works now.

I'm also working on trying to generate "light" and "dark" characterizations on the fly. Using existing characterizations for dark and light colors, I've been able to discover that, in most cases, the various characterized colors lie more or less on a line through Lh(ab)C(ab) space. This means that I should be able to characterize a color as a midpoint vector and a scaling vector, the midpoint being the value of "medium", and then generate light, dark, very light, very dark, etc. by adding appropriate multiples of the scaling vector to the midpoint vector. I've determined these for sapphire, emerald, and amethyst so far. The linear interpolations generate pretty good matches. It appears that "deep" means the same as "very very dark", for example (both are a lightness index of -4). In case anyone is interested, sapphire has a midpoint vector of (67.332, 245.564, 0.368) and a scaling vector of (9.227, 2.699, -0.131), emerald has a midpoint vector of (76.166, 56.050, 0.584) and a scaling vector of (4.416, -5.959, -0.075), and amethyst has a midpoint vector of (70.415, 332.846, 0.226) and a scaling vector of (8.125, 3.357, -0.085).
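For what it's worth, here's the interpolation written out as code, using the vectors quoted above (a sketch of the scheme as described, with positive lightness indices lighter and negative ones darker):

```python
# Each color family is a midpoint vector ("medium") plus an integer
# lightness index times a scaling vector. Index 0 is "medium", -4 is
# "very very dark" (a.k.a. "deep"), and so on. The vectors are the ones
# quoted in the post; the function name is mine.
FAMILIES = {
    "sapphire": ((67.332, 245.564, 0.368), (9.227, 2.699, -0.131)),
    "emerald":  ((76.166, 56.050, 0.584), (4.416, -5.959, -0.075)),
    "amethyst": ((70.415, 332.846, 0.226), (8.125, 3.357, -0.085)),
}

def shade(name, lightness_index):
    """Linear interpolation: midpoint + index * scaling, componentwise."""
    mid, scale = FAMILIES[name]
    return tuple(m + lightness_index * s for m, s in zip(mid, scale))
```

So shade("sapphire", -4) would be the "deep sapphire" point, and shade("sapphire", 0) is just the medium midpoint back again.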

I'll probably generate another doohickey sooner or later to display the results of my meanderings.

Friday, November 10, 2006

The last few days...

I haven't had a lot to say here the last few days. Mainly, I've been tired, between a busy week at work and a busy week of watching politics on TV.

Yes, yes, I'm quite happy with our new Democratic overlords. And I'm impressed with Senator Chafee for telling George Bush where to stuff his nomination of Ambassador Bolton. Let's hope that someone will stand up against adopting that hideous domestic surveillance bill that Bush also wants adopted before the end of the term, too. (So much for bipartisanship. Not that we expected Bush to actually be bipartisan....)

I was thinking about writing up comments on the candidates for the Arbitration Committee, but I don't really see the point. I've lost most of my respect for the Arbitration Committee, and don't really care who gets elected to it. Even if I ever do start editing again, I don't think that I'll have much need to interact with them. The current candidates, last I looked, ranged from lackluster to atrocious, with nobody in the entire lot that really inspired me to positive comment. I've since removed the RSS history feed for the page from my blogroll, and am officially paying no further attention to the topic. My single comment to voters: do not elect good writers to the ArbCom. Unless they use speed, or have no lives at all, they will not have time to be both good writers and good arbitrators. In most cases, someone who is a good writer is better off serving Wikipedia as a writer than as an arbitrator. And given the increasing lack of relevance of the Arbitration Committee, it seems like a double waste to put a quality writer on it.

The MediaWiki port moves slowly. Semirelatedly, I am looking at JavaCC, both as a possible tool for developing a markup parser and as a tool to assist in writing a generalized PHP to Java converter. Unfortunately, I haven't been able to get the JavaCC Eclipse plugin to work right yet. This is not critical, but it qualifies as annoying, and I don't like annoying.

Product endorsement: Sensaphone makes very nice infrastructure monitoring systems. We deployed one at work; it's easy to use, and the monitoring devices are quite reasonably priced as well.

Oh, and kudos to Liebert. I mentioned them a few days ago in my blog (without even linking them!), in my rant about Wikipedia's lack of an article about drycoolers. Apparently their marketing department pays attention to the blogosphere, noticed my post (in this truly insignificant blog), and forwarded it on to an engineer, who sent me some materials that included information about drycoolers. As this appears to be sales literature, I will gladly forward it on to anyone who wants to use it to improve Wikipedia's content about drycoolers, or about precision cooling generally (or even Wikipedia's nonexistent article about the Liebert Corporation). Anyway, good customer relations, there.

Wednesday, November 08, 2006

Presidential product placement?

In Bush's comments to the press today, he included the phrase "carnage on their Dell television screens". Why in the world would George Bush mention Dell in this context? Why is our President giving product endorsements during press conferences?

Monday, November 06, 2006

Victory over the evil Apache Proxy!

Adding "AddDefaultCharset utf-8" to my proxy definition in httpd.conf dealt with the minor issue I've been having with some pages rendering badly because of UTF-8 content that apparently was being interpreted as ISO-8859-1. I am much happier now.


Someone mentioned on IRC that a community ban of SPUI has been proposed. This is an extraordinarily stupid idea, for reasons that will be obvious to anyone who has taken the time to understand SPUI. Below is an email I wrote to the ArbCom about how to manage SPUI. I am reproducing it here (slightly edited) for the general betterment of mankind. It was written on June 30, 2006.

I've been evangelizing on IRC about how to handle SPUI for some time now. Basically SPUI is a continuous breaching experiment -- he is constantly testing the limits of our collective patience. He is going to do this; that is the nature of SPUI. The problem is that banning him -- or any sort of long term block -- will just encourage him to try to subvert the system in less obvious ways. It is my opinion that we'd rather have SPUI editing as SPUI rather than as an army of SPUIsocks, which I firmly believe is what would happen if he ever gets blocked (and remains blocked) for an extended time. And we all know how hard it can be to deal with that sort of thing.

Fortunately, one thing about SPUI is that he stops when he finds the boundary. So really all that need be done to manage SPUI is to "smack him on the nose with a newspaper": give him a short block (3 hours to 1 week, depending on circumstances) whenever he goes too far. He seems to accept short blocks in good grace (he knows what he's doing, after all) and after his blocks clear he almost always goes back to his usual editing, which, by and large, is productive and useful.

Psychim complained to me [on June 30th] that if we only block him for a short time when he does these things, then SPUI will have won. This is the real problem here: admins who see this as a contest. Too many people are now trying to "beat SPUI", and that has resulted in a great deal of misbegotten vehemence toward him as a person, rather than toward his individual actions. (This is also a large part of the problem with the highway naming case, actually; a number of people opposing him there seem to be opposing his position simply because he's SPUI, and not because of any substantial fault with the position he takes on the substantive issue. The ArbCom should be giving those people a firm rap on the knuckles for making it a dispute of personality instead of a dispute over policy, but I have little hope that that is what you'll actually do in that case.) Many of our admins have decided that he is a troll (and, to an extent, he is, but he's a manageable troll who happens to be useful), and as a result they no longer give him any semblance of good faith even when he is trying to contribute in good faith. To be fair, SPUI brings this on himself by being so tendentious in his breaching experiments, but that doesn't excuse people from treating him fairly the rest of the time.

Generally put, SPUI is a valuable contributor who is also a low-level annoyance, and our admin corps is not good at managing people like that, because their operational mindset is one of combat instead of one of management. Whenever he goes outside the boundaries of acceptable behaviour, he should be promptly given a short block (from a few hours to a few days, at the longest) with a clear and polite message as to what he did to get blocked. Hauling him before a firing squad for what would otherwise be minor offenses just riles him up more and allows him to start his usual campaigns (which is also why I think short blocks are a good idea; he doesn't have time to build a campaign against the block if it's only for a few hours). Conversely, NOT blocking him for minor offenses encourages him to commit more major offenses (remember, he's seeking limits, and he will continue to probe until he finds the actual limit). Also, he knows that long (and especially indefinite) blocks won't stick anyway.

(Update: Stupid missing parentheses.)

Sunday, November 05, 2006

Web2.0: For men only?

I happened to be looking at Digg's "about" page. There is a collage of photographs of their staff there. All but one of them are white men. The sole exception, Nicole Williams, holds the position of "Digg Ambassador" (which presumably means that she does public relations; I would not be surprised to find out that she is their receptionist). Her personal profile starts out by describing her as their "Rude Girl", and says nothing at all about her professional qualifications, or even about what she actually does for Digg. Contrast this with, say, the profile for Steve Williams, which talks in some detail about his other activities.

So, why is there only one woman at Digg, and why is her job so vaguely defined?

Long-term Wikipedia vandalism exposed

Long-term Wikipedia vandalism exposed: "No, I think the fault here lies ultimately with the structure of Wikipedia itself. The website's openness both allows an enormous amount of vandalism to filter through every day, not all of which can be removed within minutes, and relies on ordinary people to catch hoaxes like this and remove them."

Indeed, that is one of Wikipedia's largest faults. Open editing got Wikipedia to where it is today, but left unchecked it will take Wikipedia to where it is going tomorrow: the occasional jewel of a good article, surrounded by a mix of mediocre topsoil and outright stinky crap, like the nonsense described in the article quoted above.

Open editing may be a good way to start an encyclopedia, but I am starting to think that it is not a good way to finish one.

New color scheme

I never really cared for the black & gray scheme, but I was too lazy to change it. Well, I got a burst of unlazy today. Hope y'all like lavender. (The background color, however, is very close to one of the colors known as "why can't we be antisocial", which I find somehow appropriate.)

(The other two colors called "why can't we be antisocial" are this one and this one, although I just deleted the second one as it was clearly the result of someone who did not understand the interface.)

Friday, November 03, 2006

Wikipedia as an encyclopedia of technology

Background: At work one of the things I am responsible for is the cooling of the equipment in the computer room. We have a Liebert Foundation XDF cabinet, a cutting-edge design that uses a digital scroll compressor and a local R-407C refrigerant loop to provide cabinet cooling, with the heat rejected via a plate condenser into a water-glycol loop that carries it to a drycooler mounted on the roof. Now, I learned in high school chemistry about gas laws and adiabatic expansion and the general principles by which refrigerators work. However, I have never had a reason to learn any practical knowledge of this, or even think very hard about the details of the technology. Our XDF unit has had some difficulties over the past few weeks, which were finally traced to a refrigerant leak and repaired today (permanently, we hope). In any case, I wanted to learn more about the technology we're using, and specifically about drycoolers, a technology that I freely admit I don't really understand. So I tried to research them on the web. Sadly, Wikipedia has no article on drycoolers. None of the industrial refrigeration articles mention them, even though they're widely used in commercial and industrial refrigeration systems. In fact, neither "drycooler" nor "dry cooler" appears in any article on Wikipedia at all. (I still don't know exactly how one works. I suppose I should ask my sales engineer sometime.)
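For the curious, the gas-law principle I'm referring to can be sketched numerically. This is a back-of-the-envelope calculation assuming ideal-gas behavior and a reversible adiabatic expansion -- real refrigerants like R-407C deviate substantially from ideal behavior, and the numbers here are made up for illustration -- but it shows why letting a compressed gas expand cools it, which is the heart of any refrigeration cycle:

```python
# Ideal-gas reversible adiabatic expansion:
#   T2 = T1 * (P2 / P1) ** ((gamma - 1) / gamma)
# where gamma is the heat capacity ratio (~1.4 for diatomic gases like air).

def adiabatic_expansion_temp(t1_kelvin, p1, p2, gamma=1.4):
    """Final temperature after adiabatic expansion from pressure p1 to p2."""
    return t1_kelvin * (p2 / p1) ** ((gamma - 1) / gamma)

# Expand air starting at 300 K (about 27 C) from 10 bar down to 1 bar:
t2 = adiabatic_expansion_temp(300.0, 10.0, 1.0)
print(f"{t2:.0f} K ({t2 - 273.15:.0f} C)")  # roughly 155 K, far below freezing
```

The dramatic temperature drop is why the expanded refrigerant can soak up heat from the cabinet air before being recompressed.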

In general, Wikipedia's coverage of technical topics not related to computers, airplanes, or trains is spotty. The other area where I've specifically noticed this is electrical distribution systems. One of my other responsibilities is ensuring that the computers get enough power. We upgraded our UPS from a 12kVA single-phase unit to a 20kVA three-phase unit. At the time, I was not really clear on how three-phase power works, and so I researched the topic. Wikipedia has OK, but not good, coverage of this topic. It has much poorer coverage of the power distribution grid and the technologies used to accomplish power distribution. The article on power conversion states that "power conversion is the process of converting power from one form to another" and then goes on to list several forms of power that one might convert between. It also lists a few methods used. It does not discuss the relative merits and disadvantages of each method, or talk about which ones are predominant. The discussion of why one might want to convert power is superficial and conclusory. No major vendors of power conversion products are listed. All of these are things I would expect in a decent technical encyclopedia article. The current article is woefully inadequate.
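As a small illustration of why the three-phase question mattered in practice, here is the standard apparent-power arithmetic. The voltages are assumptions on my part (typical US values of 120 V single-phase and 208 V line-to-line three-phase), not the actual figures for our installation:

```python
import math

def single_phase_current(kva, volts):
    """Line current for a single-phase load: I = S / V."""
    return kva * 1000.0 / volts

def three_phase_current(kva, volts_line_to_line):
    """Line current for a balanced three-phase load: I = S / (sqrt(3) * V_LL)."""
    return kva * 1000.0 / (math.sqrt(3) * volts_line_to_line)

# The old 12 kVA single-phase UPS vs. the new 20 kVA three-phase unit:
print(f"12 kVA @ 120 V single-phase: {single_phase_current(12, 120):.0f} A")
print(f"20 kVA @ 208 V three-phase:  {three_phase_current(20, 208):.0f} A")
```

The point of the sqrt(3) factor is that the three-phase unit delivers considerably more power while drawing less current per conductor, which is exactly the sort of practical explanation the Wikipedia articles were missing.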

Surely there are some people out there with expertise in these areas who might do something about Wikipedia's poor coverage of commercial power systems and commercial refrigeration systems. I suppose the main reason there is not good coverage of these topics is that they tend not to be studied in colleges; they are skills normally learned initially in trade schools, and mastered eventually in on-the-job apprenticeships. And these people, I strongly suspect, are not on Wikipedia, and even if they are, they are more likely to write about professional wrestling, NASCAR racing, or football than about the secrets of their trade. Still, there are engineering students who must learn at least some of this stuff (someone has to design power converters and refrigeration systems, and those people generally have gone to college and garnered at least one degree in engineering). Is Wikipedia not appealing to engineering students? If not, why not?