Friday, April 28, 2006

Need an excuse to visit SF?

[via jwz]

June 6th at DNA Lounge: Front Line Assembly / Stromkern / DJ? Acucrack / Deathline Intl

Now that has got to be a good show. They are playing Seattle Sunday night at the Fenix (sigh... will someone buy them new speakers that actually sound good in that space?). Of course, I'll just have returned from a week on the East Coast, introducing Natascha to NYC and hang'n with college friends at a wedding in Bumblefunk, Pennsylvania. Arggg. I may just be back in time for the show... maybe.

Thursday, April 27, 2006

VS Vitriol

It seems my post about VisualStudio 2005 hit a nerve.

It really is an interesting lesson to see how different people respond to VS 2005. Compared to VS .Net (Version 7.0, the release before VS 2003), VS 2005 is a huge improvement. The only explanation I can give for that release is that Microsoft's Developer Division wanted to make sure the update would look really good by comparison: the initial .Net release was so abysmal that most developers I know at Microsoft avoided it at all costs. Part of the problem was that VisualStudio .Net tried to integrate the VisualBasic development experience with the Visual C++ experience. Kind of like the lava-lamp I had in college... the two just don't really mix.

Visual C++ was all about fast and minimal. It had decent Intellisense (which I still can't get working in Emacs... sigh) but was fast! I would use it instead of Notepad, because it handled multiple files and non-MS end-of-line encodings. VC++ 6.0 had shipped when I started at MS in 1997 (I don't know the exact date it shipped...) and VS .Net shipped in what... late 2001, I think? That is a long time. VS .Net may have performed on par with how VC++ 6.0 ran when it originally shipped, but by then people were used to running VC++ 6.0 on screaming-fast development machines. I remember the first time I used VS .Net. I couldn't believe how long it took to load. I felt like taking a coffee break just waiting for it to start up and open a single file. And it crashed, and crashed, and leaked memory like a sieve.

Compared to that, VS 2005 is a wonder of engineering beauty. For basic C# development, it is faster than any of the Java IDEs I've been using. It finally provides basic refactoring support (which is all I ever really find useful...). Its project files are in XML and actually almost make sense. Then again, the only reason I care that the project files are in XML is because the project UI is so annoying that I often just resort to manually editing the project file. If you ever want to build a project that has tens of included files, do yourself a favor and edit the project file directly.
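
To see why, here is roughly what the hand-edit looks like; VS 2005 project files are MSBuild XML, though the file names below are invented for illustration:

<ItemGroup>
  <Compile Include="Parser\XmlScanner.cs" />
  <Compile Include="Parser\XmlTokenizer.cs" />
  <Compile Include="Parser\XmlNameTable.cs" />
  <!-- ...dozens more; far faster to type than to click through the UI... -->
</ItemGroup>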

The real problem with VisualStudio is that it is a huge kludge of what used to be Visual C++, Visual Basic, and Visual InterDev... all in one package. Half is written in MFC, half stolen from Office, and another half written from scratch (some in C++, some in C#). It is actually amazing that it works. It is virtually impossible to implement such a huge beast in a clean and efficient way. Even if you wanted to, you would then have to spend man-decades trying to port all that existing code to the new framework.

Ultimately, VS suffers some of the same problems as Windows Vista. It is too sprawling and too full of legacy crap to have any kind of coherent plan.

One of the reasons why Linux has been able to eat Microsoft's lunch in some markets is that it is so much smaller and less dragged down by legacy. I don't mean support for legacy devices, I mean legacy apps.

Back to VisualStudio: the problem with day-dreaming about a replacement is profits. There is very little profit in shipping development tools, especially now with Eclipse and NetBeans setting the price bar at $0. A good development tool, something even remotely close to VisualStudio, is a huge effort, easily into the man-decades, possibly man-centuries, of development effort. What company in their right mind would fund that? IBM funded Eclipse because they needed a good IDE to replace their aging VisualAge series.

On that optimistic note, I'll just enjoy the fact that VS 2005 doesn't actually crash much at all, and pray that someone ships a real C# environment for Eclipse.

Wednesday, April 26, 2006

VisualStudio ...oh my...

I've had Visual Studio installed on my machine for less than a month. Already, MSDN is completely broken. What kind of product can bork itself that quickly? I'd uninstall and reinstall, except that would require shutting down, installing the DVD drive, booting up, installing MSDN, shutting down, putting the spare battery back in, and booting up again. Given near-universal internet access, I can live with msdn.microsoft.com. MSDN documentation is horribly organized and the search engine is crap, but I would like _something_. No wonder Microsoft stock is moribund. What developer wants to waste time with this crap? It is embarrassing (for Microsoft) that I've had so much better experiences with Sun's JDK and Eclipse. Sure, VS is faster, but what good is faster when it doesn't work?

It isn't just MSDN that is broken. Every once in a while (I can't track down what triggers it), something goes wrong with my solution, and VS stops applying the defines! I have a common code-base that is shared, via include links, amongst 3 different projects within a common solution. Basically, there is the release build, the unit tests (which need access to the internals), and a compact-framework build. A few parts of the main code are either tweaked (for the unit tests) or disabled (for the compact framework). Everything will work like a dream until my tests randomly start failing because the #if block for the tests is getting skipped. Closing down VS and restarting it seems to fix the problem. It seems like there is some caching going on, and the cache-management code doesn't realize that while the file path may be shared, the differing defines mean it isn't identical code. Sigh.
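
For the curious, the shared code looks something like this sketch (the define names here are invented; the real ones differ). The same physical file is supposed to compile differently in each project, depending on which symbols that project defines:

// Shared file, linked into all 3 projects via include links.
public class Scanner
{
#if UNIT_TESTS
    // The unit-test project defines UNIT_TESTS and gets a peek at internals.
    public int BufferSizeForTest { get { return bufferSize; } }
#endif

#if !COMPACT_FRAMEWORK
    // Too heavy for the compact-framework build, so it is compiled out there.
    public void EnableMemoryMapping(string path) { /* ... */ }
#endif

    private int bufferSize = 4096;
}

When the bug hits, VS behaves as if the file had been compiled once and reused verbatim, with the #if blocks resolved the same way in all 3 projects.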

Stuff like this is why they need to toss the VS code-base. Microsoft needs a new IDE targeting low-level coders. I don't need design surfaces. I just want a fast editor with debugging and basic refactoring support. Unfortunately, I'm fully aware that this is actually a great deal of work, but there is something just fundamentally wrong with VS as it is today.

Tuesday, April 25, 2006

More Developers != More Features

John Dvorak has published his usual provocative, incendiary ramble, this time about "The Great Microsoft Blunder", referring to Internet Explorer. As a developer, I can't help but scoff at the idea, as IE 4 was leagues ahead of Netscape (and yes, that was before I joined the company). As a stockholder in Microsoft, I can't help but nod my head in agreement with some of his points. But the quote that I saw republished all around, and the one I want to take issue with, is:
All the work that has to go into keeping the browser afloat is time that could have been better spent on making Vista work as first advertised.


Having watched the train-wreck that was Vista from the inside, I can tell you that not having IE would have done virtually nothing to help Vista ship as advertised. As anyone who has read a modern management book (or lived in the trenches) knows, more people does not equate to more features. IE may have pulled a few people off of Vista, but it impacted Windows Presentation Foundation many times more. Vista isn't neutered and delayed because of any lack of people. It is a mess because of middle management. In a large project, complexity scales with the connections between the pieces, and connections multiply far faster than the pieces themselves (n components have n(n-1)/2 potential connections): 3 projects is roughly 2x as complicated as 2 projects.

It gets worse. At least at the time I left (6 months ago), I saw very little evidence that the management of these failed projects was getting any kind of a slap on the wrist. The problem is that in an org that large, with that many dependencies, it was impossible to tell which team was actually mismanaged and which team was just dragged down by the other mismanaged teams.

Vista could never have been what was originally sold to audiences oh-so-many years ago. It was not even vaguely possible to deliver that pipedream. You just can't replace a simple, straightforward design with one filled with indirection and abstraction and claim that it will perform equivalently. Rather than building on top of successful designs, Vista was all about throwing what worked out the window and starting over. You can do that with a few parts of a large release and get away with it, but if you do that with too many components, then you just have a flailing juggernaut that doesn't know how to stand up on its own anymore.

Witness: Avalon, aka Windows Presentation Foundation. This was supposed to revolutionize GUI app development. Well... last I heard, it was still plagued by performance issues, and is really just a better WinForms. I've used WinForms... that isn't that hard.

Witness: WinFS. Just thinking about WinFS makes my blood boil. The core has some amazing ideas, and makes me drool with excitement, but the execution is so bad that I feel like I'm watching someone try to climb to the moon by hitching themselves up with their own bootstraps. Every time I look into what WinFS is today, they have redefined the mission, and every time it is less and less interesting. I've given up. The W3C's crazy Semantic Web idea has more likelihood of having real impact than this mess.

Vista is full of goodness too. The new window manager will rock. I'll still just feel like I'm using Mac OS vCurrent-2, but it is getting there. There are a number of security fixes (far beyond the much-lamented authorization pop-ups that everyone is talking about) that I want on my desktop ASAP. And then there is IE7, which would matter to me if I didn't just use Firefox. Too little, too late.

Monday, April 24, 2006

Beware Bags of Bools

OSNews is running an article about using Finite State Machines in your C++ code. The write-ups are only marginally interesting, in my opinion. (I prefer more efficient implementation strategies, although that does equate to more implementation effort.) It did remind me of one of my pet peeves and a lesson I learned while having to maintain other people's code:

Beware bags of bools! (Do not store state as a collection of boolean variables.)

If you have ever had to maintain non-trivial code that uses a proliferation of flags to indicate the current state, you know what I mean. Does flag foo matter when flag bar is set? Or you fix a bug only to find that your fix breaks some esoteric case where the code can't handle that particular combination of flags. Worse, each flag you add doubles the total 'state-space', i.e. the set of potential states that the code has to handle. Just 5 flags equates to 32 potential combinations. Are you testing every code path with all 32 potential combinations? I didn't think so.
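
To make the peeve concrete, here is an invented example of the anti-pattern (all names hypothetical):

class Connection
{
    private bool isInitialized;
    private bool isOpen;
    private bool isClosing;
    private bool hasError;
    private bool isDisposed;    // 5 flags = 32 combinations, most of them nonsense

    public void Send(byte[] data)
    {
        // Every method must re-derive the "real" state from flag combinations,
        // and nothing prevents contradictions like isOpen && isDisposed.
        if (isInitialized && isOpen && !isClosing && !hasError && !isDisposed)
        {
            // ... actually send ...
        }
    }
}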

The solution? Store the majority of your state as an instance of an enum. All state changes should be of the form:

switch (currentState)
{
    case State.PreInit:
        Init();     // leaving PreInit requires the one-time setup
        break;
    case State.Closed:
        throw new InvalidOperationException("already closed");
    default:
        break;      // every other state may transition freely
}
currentState = State.Foo;

Now when you look back at the code in 2 weeks (or 2 years), you know which states you thought needed some sort of setup, which states are invalid, etc. You can add new states and know (mostly) everywhere that you need to add new cases. Most importantly, when writing/debugging the code initially, you can see which states are handled and which are not. When I code-review code like the above, I just need a list of all the State values, and I can walk through them and validate that the code isn't missing some scenario. Our brains can easily handle walking an enum of 32 values. We cannot manage the same when handed 5 boolean flags. Just try it. I dare ya!

Being an XML guru (of sorts), I tend to write a disproportionate number of XML-ish parsers and writers. State management like the above has become invaluable to me. Have you ever tried to implement the full System.Xml.XmlReader API, with its funky MoveToAttribute()/ReadAttributeValue() modal behaviour? I've implemented it multiple times. My first attempt used flags. My 'cleanup' shifted the code to use explicit states. I found and fixed numerous bugs just in the cleanup! Writing the code with explicit states made the final code dramatically clearer. Often when debugging, the resolution was simply that I had forgotten to handle one state. Add a case and away you go. Even better, because the states are explicit, I know I'm not impacting some other scenario.
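
As a purely hypothetical illustration of why this helps (none of these names are the real implementation), the modal attribute calls become almost mechanical once the reader's position is an explicit state:

class StatefulReader
{
    private enum ReaderState { OnElement, OnAttribute, OnAttributeValue, OnContent, Closed }

    private ReaderState state = ReaderState.OnElement;
    private int attributeCount;         // set when the scanner lands on an element
    private int currentAttribute = -1;

    public bool MoveToFirstAttribute()
    {
        switch (state)
        {
            case ReaderState.OnElement:
            case ReaderState.OnAttribute:
            case ReaderState.OnAttributeValue:
                if (attributeCount == 0)
                    return false;               // no attributes to move to
                currentAttribute = 0;
                state = ReaderState.OnAttribute;
                return true;
            case ReaderState.Closed:
                throw new InvalidOperationException("reader is closed");
            default:
                return false;                   // only element nodes carry attributes
        }
    }
}

Every state is visibly handled, rejected, or deliberately ignored; with a bag of bools the same decision gets smeared across a half-dozen if checks.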

Do a favor for the future maintainer of your code. Don't store your state as a grab-bag of bools.

Sunday, April 23, 2006

Thomas Dolby

Thomas Dolby performed here in Seattle, at the Fenix Underground. It was an amazing show. He played a lot of his classics, but what really made it was that he was actually up there performing almost all of it live! He also had a custom VJ rig, with what looked like 2 iSight cameras mounted to his gear and another mounted to his headset. During the show the video screen would switch between the various cameras. It was amazing to watch from the head-cam and be able to see what he was doing with the various instruments.

He was surprisingly chatty with the crowd, and did a great job of really making the show feel interactive. Twice during the show he built a song up by playing the various layers individually, slowly and sequentially adding layer upon layer. He had a guy hidden behind his gear who was helping run the computer (a Mac) that choreographed it all, but it was still really amazing to see. A true geek's show.

The weird part was talking with my girlfriend after and realizing that she had no idea who he was going into the show. How could one not grow up worshiping the man behind "She Blinded Me with Science?" What a difference just a few years makes! Despite that, she enjoyed the show, dorkiness and all.

Saturday, April 22, 2006

Liquid Crystal Goodness

I've been spending more time working from home recently, much to the delight of my cat. I've been noticing some tightness in my shoulders, though, which I attributed to the awkward posture of spending an entire day hunched over a laptop. So I splurged. Wednesday, a gorgeous 20" LCD display arrived from Dell. I've been a big fan of LCD displays for a long time. I first got one back in ~98 or so, due to limits on space, and I've used them at home exclusively since. This new display is huge compared to that original one, and it also manages 1600 x 1200. Me likes it. That is big enough that I can have either Eclipse or Visual Studio open at less-than-full-screen and they are still usable! The joy of it.

The new monitor also came with definitive proof that I'm not a gamer: my old graphics card could not drive it. Oddly, it could handle 1600x1200 at 70Hz, but not at 60Hz. So I popped on over to Computer Stop and picked up a new card. What was interesting was pulling out my old card: a GeForce2 AGP. I think that dates from 2000 or so. It was great for an hour of Need For Speed back in the day, but not exactly up to modern standards. Now I have a fancy new card that is probably more capable than the elderly Athlon telling it what to do with itself.

Word of warning: I also picked up a Belkin "Compact KVM" so that I could use a USB keyboard/mouse combo and be able to plug either my Dell D610 or my 12" PowerBook in to that luscious monitor. Not recommended. It occasionally gets freaked out by something and decides not to allow me to switch to my laptop. Apparently it overrides your manual override when it doesn't think there is a device there. Grrr. 30 minutes with their tech support just proved that the device is crap.

Tuesday, April 11, 2006

StAX of the future

In his blog, Paul Sandoz just posted an entry about what he would like to see in store for the StAX API in the future. I worked with a number of people on the typed extensions to the XmlReader API in v2.0 of the .Net Framework. Adding typed extensions to a heavily text-oriented API (as all XML parsing APIs tend to be) is a challenge, but I agree with Paul that StAX is the place to add it. It is almost impossible to add a good typed extension to an API like SAX, where the parser is in control. But with an API like StAX, where the application is in control, it can be added without disrupting the original API.

Off the top of my head, these are the issues I remember being difficult when we added a typed API to XmlReader (a sketch of how the shipped .Net API handles a few of them follows the list):

  • What happens when the content isn't already typed? One of my goals in the XmlReader APIs was that a client of the APIs should not need to know whether the data on the wire was typed. That way the client can have the same code for text-XML and binary-XML. It also means that the typed API extensions serve as potentially useful utility methods for all users of the API, not just those building on a non-text serialization parser.
  • How to handle comments and processing instructions? XQuery really made a mess of this, in my opinion. According to the XQuery data model, <x>12</x> and <x>1<!--note-->2</x> both have the decimal value 12. A user of the API should be able to do something like this:

    reader.readNext(ELEMENT, nsNone, "int");
    int value = reader.readValueAsInt();
    reader.require(END_ELEMENT);

    Should the comment be exposed after the value, or just lost entirely?
  • Dates... There are lots of ways to format dates, and all sorts of complications with time-zones. The 'standard' for XML is ISO 8601, but that isn't the default format for any of the date/time classes in Java/.Net/Python/etc., as far as I know. In .Net we settled on requiring ISO 8601, forcing users of other formats to parse the text manually.
  • Where to end? Where should the parser be positioned when the call returns? If you want to support skipping over comments and processing-instructions, it must move at least to the close tag. But is it even necessary to leave it there? Why not skip past that? It really depends on the other methods of the API.
  • What happens if the content does not match what is expected? This may mean that the element has sub-elements, that the content is empty, or that it can't be converted to the requested type. Where is the parser positioned after the error?
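
As promised above, here is a rough sketch of how the typed calls that shipped in the .Net 2.0 XmlReader look in use. ReadToFollowing() and ReadElementContentAsInt() are the real .Net 2.0 method names, but the document and class below are invented for illustration: the typed read coalesces the text content, ignores the comment, returns 12, and leaves the reader positioned past the end tag.

using System;
using System.IO;
using System.Xml;

class TypedReadSketch
{
    static void Main()
    {
        string xml = "<root><int>1<!--comment-->2</int></root>";
        using (XmlReader reader = XmlReader.Create(new StringReader(xml)))
        {
            reader.ReadToFollowing("int");
            int value = reader.ReadElementContentAsInt(); // 12, comment skipped
            Console.WriteLine(value);
            // reader is now positioned just after </int>
        }
    }
}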


I think StAX is a significant improvement over SAX, and I would love to see this additional evolution happen. Today the XML APIs on the Java platform seem to be either too low-level (SAX) or too abstract (JAXB). StAX with some typed extensions, plus some helper methods to simplify its use in real code, would go a long way toward filling that gap. The important evolution of current parsing/serializing APIs should be about simplifying the code that the client must build on top of the API. Most people writing code against these APIs are not XML gurus, and the API should make it easy for them to do the right thing, hiding more of XML's complexities.

Thursday, April 06, 2006

The Trinity of DOM

Back when I was first working on Microsoft's XML implementation, MSXML, the spell checker used to try to 'correct' DOM to Doom. Our W3C DOM representative at the time (Rich Rollman) and I used to get a hearty chuckle out of that. We invested a great deal of effort into trying to build a usable implementation of the DOM, one that was conformant with the public spec but included extensions for things like Namespaces and usability improvements (like the text property or the selectNodes() method). Fast forward 7+ years... XML-DEV currently has a lively thread about the DOM, and below is my recent contribution. I thought some of you who may not monitor that mailing list might find it interesting, so I'm also publishing it here.

When thinking about the DOM, I think it is worth breaking the users into 3 groups:

  1. Browser (Javascript) HTML DOM
  2. Browser (Javascript) XML DOM
  3. Java/C++/C# XML DOM


The issue with the DOM for group (1) isn't the DOM API; it is incompatible browsers. If the browsers ever 'fixed' that (they won't... it is always in someone's best interest to have features that the others don't), then the DOM API might benefit from some tweaking, but the API is pretty heavily tailored to their usage and isn't really all that bad.

Group (2) users would definitely benefit from E4X, when it becomes standardized. No future google/flickr/myspace is going to succeed by requiring people to upgrade their browser, so while E4X might be a productive toy for a lucky few, it isn't going to be anything better than that until IE ships it and a few years have passed.

Group (3) users are the ones who really suffer unnecessarily. The DOM is a freaking pain to use and has been a hindrance to XML adoption. Worse, DOM's awkwardness has pushed developers to roll their own APIs. I see this as a negative because the majority of those I have seen (and I got to see a lot while on the XML team at Microsoft) got basic XML-isms wrong. XML looks effortlessly easy, but it is full of hidden gotchas that take careful (sometimes repeated) reads of the spec to understand. (On a side note, I find it amusing that 8 years after the XML spec was released, I'm still relying on my SGML history to explain _why_ bits of the XML spec are the way they are.)

XLinq, XOM, etc. are a good thing. It really is too bad that .Net and Java both standardized on the DOM as their standard API. Microsoft, being Microsoft, extended the DOM with methods like selectNodes and selectSingleNode, and you know what? That is a good thing. The W3C DOM committee should look at some code written using selectNodes vs. their official APIs. selectNodes is easy to use and actually encourages more robust code. I can't count how many times I've looked at customer code that would break if there was a Comment, PI, or CDATA section where it expected just PCDATA.
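
To illustrate with the .Net flavor of that same extension (SelectSingleNode is the real System.Xml method; the document shape is invented), compare the fragile child-walking code with the XPath version:

using System;
using System.Xml;

class SelectNodesSketch
{
    static void Main()
    {
        XmlDocument doc = new XmlDocument();
        doc.LoadXml("<order><!-- audit note --><id>42</id></order>");

        // Fragile: assumes the first child of <order> is the <id> element,
        // so an unexpected Comment, PI, or CDATA section breaks it.
        XmlNode first = doc.DocumentElement.FirstChild;
        Console.WriteLine(first.Name);      // prints "#comment", not "id"

        // Robust: say what you actually want.
        XmlNode id = doc.SelectSingleNode("/order/id");
        Console.WriteLine(id.InnerText);    // prints "42"
    }
}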

The ultimate test of an API is whether it enables non-specialist developers to get their task done such that their code is reasonably robust. XML makes that damn hard. The XML DOM, per spec, makes it worse. I don't think many of the alternatives really resolve that. E4X may be the best attempt yet. XLinq has some excellent qualities, but it is too closely tied to C# and tries too hard to be everything to everyone. XML is used in many different ways that place competing demands on the APIs. The best APIs pick a target set of scenarios and heavily prioritize those. That is hard to do in a committee, and the DOM demonstrates it.

My rant done, I want to say that the DOM was a critical keystone in getting XML adopted as widely as it has been. For all its flaws, the DOM set a common bar for XML in-memory stores. SAX did the same for XML parsing. I dislike the phrasing 'worse is better', but it definitely is true that 'shipping is better than perfection'. Nothing real is perfect. The fact that a number of people put in the effort to create the XML DOM spec has been a factor in XML's success. For all its flaws, more than likely, the world is a better place for having it here.