Funny Boo Error Message

Yet another example of how Boo is laid back:

Language feature still not implemented: ‘complex slicing for anything but lists, arrays and strings’. (BCE0031)

I got a good laugh out of that message. I was trying to use slicing on an array inside a “macro”. I’m not sure why this error happens though; I’m assuming that the slicing AST is resolving before the macro transforms the body.

The 5 Laws of Code Generation

This is a refresh of an old blog post. The more I look at Oslo and contrast it with what I have been working on, the more I want to verify that NBusiness isn’t actually redundant or made obsolete by Oslo. Obviously the two will be competitors, but what I’m trying to figure out is what the difference between the two actually is and, more importantly, why NBusiness is the more correct solution to the problem.

I would like to start out by saying that Oslo and NBusiness are both DSLs, and core to any DSL is the transformation of its metadata into something more concrete, be that a less abstract DSL, actual executable code or something else entirely. So what I’m calling Code Generation is essentially that transformation process. Additionally, what used to be called an “intermediate format” I’m now simply calling Metadata. To me, in this context, metadata is simply the name for the actual DSL declaration of a given application.

So here are those rules again, reframed into the context of Oslo and how I feel it may violate them.

1. Code generation is controlled through modifiable templates.

Translation of your metadata into another form should never be done through a black box. I’m not entirely sure how customizable Oslo’s SQL generation is, but from what I understand it’s pretty opaque. The entire system of translating MGraph into another form should be completely transparent and built so that it can easily be shaped into any form. If anyone can explain to me how Oslo translates M into SQL, and show me how I can do it myself and alter the SQL that is generated, then I’ll be happy to change my mind on this one, but it feels rather opaque to me at the moment.
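As a concrete sketch of what rule 1 means in practice, here is metadata being pushed through a plain, editable template. The entity shape and template text are entirely hypothetical, made up for illustration; they are not Oslo’s (or NBusiness’s) actual formats:

```python
from string import Template

# Hypothetical entity metadata: the DSL declaration reduced to plain data.
entity = {
    "name": "Customer",
    "fields": [("Id", "INT"), ("Name", "NVARCHAR(100)")],
}

# The template is ordinary text the team can read, edit and version;
# changing the generated SQL means editing this, not cracking a black box.
table_template = Template("CREATE TABLE $name (\n$columns\n);")

columns = ",\n".join(f"    {name} {sqltype}" for name, sqltype in entity["fields"])
sql = table_template.substitute(name=entity["name"], columns=columns)
print(sql)
```

The point is that the template sits in an ordinary file next to the rest of the source; anyone who wants different SQL edits the template, not the tool.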

2. Code generation is done during the build process.

I would like to add an amendment to this one and specify that code generation can also legitimately be done dynamically. So the rule could more accurately be changed to “Intermediate forms of metadata should never be persisted”. Not that you couldn’t write them out to temp folders, but the point is that the integration of a DSL into an application should be seamless: you shouldn’t need multiple manual steps to get it all integrated. Whether this is done at runtime or at build time is irrelevant. Presumably the many command line apps that come with Oslo for transforming your DSL into something that goes into the repository will be streamlined, but this is the sort of thing that should be avoided.
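A minimal sketch of the amended rule, assuming a made-up entity shape: the metadata is transformed straight into executable code in memory, so no intermediate form is ever persisted:

```python
# Hypothetical metadata for one entity; in a real system this would come
# from parsing the DSL source at build or run time.
entity_name = "Customer"
fields = ["id", "name"]

# Generate a class definition as an in-memory string...
lines = [f"class {entity_name}:", "    def __init__(self):"]
lines += [f"        self.{field} = None" for field in fields]
source = "\n".join(lines)

# ...and compile it directly into the running program: the generated
# source never touches disk, so there is no second copy to drift out of sync.
namespace = {}
exec(compile(source, "<generated>", "exec"), namespace)
Customer = namespace["Customer"]
print(sorted(Customer().__dict__))  # the generated fields
```

Whether this happens at build time or at runtime is an implementation detail; what matters is that there is no persisted artifact sitting between the metadata and the result.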

3. Code generation is done from an intermediate format.

I would like to amend this rule to instead be “Metadata must have a single source of truth.” I think that Oslo has a pretty good system for the intermediate format (M), but it doesn’t follow the single-source-of-truth rule. M is simply translated into the Repository, which becomes the real source of truth but is not necessarily synchronized back to the original M code (as far as I know, at least).

To me, Oslo violates this rule simply by the existence of the Repository. The Repository is essentially a “second source of truth” and can be edited from multiple sources. The DSLs should be the single source of truth, and the repository should essentially be a temp file, or part of the output of the build. Editing the repository should simply be a matter of editing your entities.

4. The intermediate format is under source control with versioning.

I would like to amend this rule also, to specify that only deltas must be part of each revision. So maybe this rule could simply be changed to “The metadata declaration must be versionable and mergeable”, which usually means that your DSL needs to be textual. I would be willing to buy into a binary format for metadata, but only if it could be versioned by my preferred source control system as deltas, not as one giant binary blob, and only as long as it didn’t break any of the above rules in the process.
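For example, a textual entity declaration (the syntax here is hypothetical, just for illustration) versions naturally as deltas:

```
entity Customer
{
    id : int;
    name : string;
    email : string;    // adding one field shows up as a one-line diff
}
```

Because the declaration is plain text, adding a field produces a one-line diff that any source control system can store, display and merge.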

5. The code generation templates are also under source control.

This places the same constraints on the transformation system as on the metadata itself, which usually translates into the idea that your templates should also be textual. Again, if you want them to be in some binary format, then there needs to be a way for source control tools to persist only deltas. So this rule could be changed to “The transformation templates must be versionable and mergeable”.


Here is a summary of these revised rules:

  1. Metadata transformation is controlled through modifiable templates.

  2. Intermediate forms of metadata should never be persisted.

  3. Metadata must have a single source of truth.

  4. The metadata must be versionable and mergeable.

  5. The transformation templates must be versionable and mergeable.

Expression Blend for Developers

Expression Blend is often portrayed as a tool for graphic designers, which is fair enough, but this can be a little daunting for people who are artistically challenged like myself. If you’re primarily a developer you probably want to use a developer tool, like Visual Studio, to do your work. But I would assert that Expression Blend actually is a tool for developers too, and not to be dismissed! I often hear one of the following reasons from developers for avoiding Blend.

  • I’m not “artsy”; I prefer to dig into code, none of this drag-and-drop business.
  • I’m working on a business application; making it look pretty is unnecessary, costly fluff.
  • Having a dependency on .NET 3.5 isn’t worth it, I’ll stick with .NET 2.0.
  • etc.

I can summarize my response to all of these positions by simply saying that I would never choose to do a WinForms or MFC application over Wpf under any circumstances. If you’re not using Wpf, even for simple business applications, even if you don’t have a single artistic bone in your body, you’re definitely missing out. I might even go one step further and say you’re actually making a mistake.

I might even go so far as to say that the same thing holds true for Silverlight development over standard Web development. The benefits aren’t nearly as stacked in favor of Silverlight for the web as they are for Wpf on the desktop, but it’s still a winner in my eyes. Additionally, with Silverlight 3 you’ll be able to create applications that run as desktop applications or on the web. That kind of re-usability (of skills and code) is another win.

And Blend is the tool to make these technologies come alive.

User Experience

User experience is not “fluff” or unnecessary. Usability translates directly into money, even for simple business applications. The less usable your application is, the larger the training costs, the longer the development time and the higher the likelihood of data input errors; general unhappiness from your customers can result if you don’t pay attention to UX. This is a competitive market; customers have ever higher expectations of every piece of software they use, especially software they’re paying money for.

And believe it or not, as a developer you’re definitely going to want to learn a little something about Expression Blend. You don’t need to be an expert who knows all of the various tricks and commands, but you’ll probably find a huge productivity gain from knowing just a little. And as we know, productivity is money in your pocket.


Typically if you’re a developer in a Wpf project you’ll find yourself in one of two roles.

  • Designer supporter
  • The Devigner (Designoper?)

The Designer Supporter

As the developer tasked to work alongside designers, you may think that because you have people doing all of the UI work you don’t need to know anything about Xaml or Blend. Or even worse, you may know just enough to screw up the designers’ workflow. Knowing a little bit about Expression Blend and how to effectively support your designers will be critical to your success. Here is a quick list of things you should learn about.

  • DataBinding
    • Binding class
    • IValueConverter
    • ObjectDataProvider
    • Design time data
  • ViewModels
    • INotifyPropertyChanged
    • INotifyCollectionChanged
  • Markup Extensions
  • Behaviors
  • Custom Controls
    • Visual State Manager
    • DependencyObject
    • DependencyProperty
    • Attached properties
  • Theming
    • Generic.xaml
    • [assembly:ThemeInfo]
  • Commands
  • Routed Events

As a developer supporting designers, you will find yourself either having to DataBind your ViewModels to an existing UI created by the designers, or having to create ViewModels that the designers can then build their UI around. Most likely the designers will want the former while developers will want the latter. All I can say is that the sooner you bring the two together and make it real, the better off you will be.

The above topics will be crucial to know in order to preserve the developer / designer workflow with minimal friction.

The Devigner

You may actually be artistically inclined enough to claim this title yourself, but then again, maybe you’re simply tasked with it out of necessity. For developers who find themselves plunging into the world of Expression Blend, here is a list of topics, in addition to the previous list, that you may want to start learning about.

  • Templates
    • ControlTemplate
    • DataTemplate
    • DataTrigger
    • TemplateBinding
  • Visual State Manager
  • Storyboards
  • Commands
  • Styles
    • Setters
    • Triggers
  • Resources
    • Merged Dictionaries
  • Gradients
  • Layout Controls
    • Grid
    • StackPanel
    • ListBox
    • DockPanel
    • Canvas
  • Layout Transform vs. Render Transform

And really the list goes on and on. There are myriad things to learn, but this is a pretty good start.

The important thing to know here though is that Expression Blend is not just a tool for Designers. It’s simply a tool geared towards creating rich interactive applications. Designers and developers alike will benefit greatly from this tool in any Wpf project, whether you’re just hooking up DataBinding for designers or doing it all yourself.

I would like to break down each of these topics in follow-up blog posts, with a developer-oriented slant on how they can be done in Blend, so keep an eye out for those!

Turn on Code Analysis Early

Both at work and on my most recent project, MetaSharp, I have been retroactively hooking up StyleCop to an existing code base. Let me just say that from now on this will be one of the first things I do when I start a new project! It’s so much easier to simply fix up new code as you write it than it is to go back over many files and fix literally hundreds of errors all at once.

It’s also important to hook StyleCop and FxCop, or whatever code analysis tool you are using, into your build process. Don’t let it be an optional manual step; get it integrated right into your build project files so you get those error messages right away.

I’m happy to say that MetaSharp is now fully compliant with FxCop and StyleCop with almost all of the rules turned on. I had to turn down the documentation rules a little bit, as well as the file header rules. Anything that goes in a file header that is unique to that file defeats the purpose, in my opinion.

The moral of the story is that if you don’t turn it on right away, it’s likely to be too much of a pain to ever turn it on! And it really does add a lot of value to the code, I believe.

MetaSharp Vision for the Future

I was just having some ideas and wanted to write them down somewhere, partly for myself and partly to get some feedback.

One of the next things I want to do is to convert the compile-to-CodeDom parts of MetaSharp into a Visitor pattern, so that I can use the same system to compile to CodeDom, generate MetaSharp, transform the AST, or whatever else I want. This will bring a lot of flexibility and power to the whole system.
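The visitor idea can be sketched in a few lines (a minimal, hypothetical node set, not MetaSharp’s actual API): the same tree accepts interchangeable visitors, one per back end:

```python
# Minimal AST visitor sketch: one node hierarchy, many interchangeable
# visitors (emit source, evaluate, transform the tree, and so on).
class Number:
    def __init__(self, value):
        self.value = value
    def accept(self, visitor):
        return visitor.visit_number(self)

class Add:
    def __init__(self, left, right):
        self.left, self.right = left, right
    def accept(self, visitor):
        return visitor.visit_add(self)

class SourceVisitor:
    """One back end: regenerates source text from the tree."""
    def visit_number(self, node):
        return str(node.value)
    def visit_add(self, node):
        return f"({node.left.accept(self)} + {node.right.accept(self)})"

class EvalVisitor:
    """A second back end over the same tree: evaluates it."""
    def visit_number(self, node):
        return node.value
    def visit_add(self, node):
        return node.left.accept(self) + node.right.accept(self)

tree = Add(Number(1), Add(Number(2), Number(3)))
print(tree.accept(SourceVisitor()))  # (1 + (2 + 3))
print(tree.accept(EvalVisitor()))    # 6
```

Adding a new target (CodeDom, regenerated MetaSharp, a transformed AST) then means writing one more visitor, with no changes to the node classes themselves.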

I was thinking about a post by Ayende Rahien the other day called M is to DSL as Drag and Drop is to Programming, and specifically about the quote “If you want to show me a DSL, show me one that has logic, not one that is a glorified serialization format.” What I took this to mean is that a DSL without logic is just a serialization format. A DSL can still be declarative but will often have concepts like less-than, greater-than or equal-to; it’s certainly not limited to these, but they are fairly common. To me his complaint (which is valid) is that with an external DSL, no matter how easy it is to write a grammar, it is still hard to express logic with a grammar, and furthermore it is just as hard to translate that logic into something executable.

With an internal DSL, such as you get with Boo, you can easily author keywords for your DSL and you get all of the logical operators for free, which is very nice of Boo. But with an internal DSL you not only get the logical operators for free, you are forced to take them. You can do less work to get an internal DSL working, but you are not operating in a constrained universe. This has trade-offs, but let’s certainly not dismiss it; there are plenty of use cases where it is the preferred approach.

However, there are some distinct benefits to an external DSL, the major tradeoff being the effort required to implement it. The main benefit is that you can constrain your universe such that only allowable logic can happen in the correct spots. It’s like a sandboxed language, which I like to call a constrained universe. And believe it or not, constraint can actually be freeing.

So my sudden flash of insight this morning was realizing that, with MGrammar, you can choose to import grammars defined in other assemblies and use the syntax and tokens defined there. So when you choose to use MetaSharp by adding a reference to the assembly, you can also import the MetaSharp.Lang grammar and easily make use of the BinaryExpression syntax in your own DSL (or anything else).

Then I was also thinking that you could probably make use of the same AST serialization tools and (soon to be) AST transformation visitors to build your own DSLs without a lot of the extra work. Using that type of system you could probably transform directly into executable code without using the templating at all, haha! Simply transform your custom AST nodes into standard supported nodes, or write your own visitor that can handle your custom nodes. Your custom visitor could also tap into the templating system, so you could write the AST transformation as a MetaSharp template if you desired.

This would put MetaSharp in the role of an extensible compiler system, where custom external DSLs can opt in to the standard language grammar where appropriate, rather than not even being able to opt out, as with current internal DSLs. This is a powerful idea and I think it is well within my grasp.