Silverlight 4 Runs Natively in .NET

This is a big deal.

I just learned this the other day and found surprisingly little information about it online. It was announced at PDC so it’s not a secret but it seems like a big deal to me. The feature is otherwise known as “assembly portability”.

When doing Silverlight development, one of the most frustrating things currently is the inability to run unit tests outside of the browser. This results in a break in continuous integration and a frustrating manual step in your testing process. Well, no more!

This is only the beginning of the implications, however. If you're writing any application with a business logic layer, you should probably be writing it in Silverlight exclusively now. No more duplicating projects and creating linked file references. You can literally just create one project, compile it, and run it in both runtimes. Incredible.

Of course there are some limitations. But in this case I almost feel like the limitations are actually benefits. The thing is, you are likely to have features in one environment that are different or inaccessible in another. For example, file system access: in .NET you can access the file system easily, but in Silverlight you can't access it directly in the same way.

Enter System.ComponentModel.Composition, otherwise known as MEF. By making your application logic composable you can solve all of the problems of framework differences and make your project eminently unit-test friendly and better in general (if you buy into the principles of IoC, at least).

For example, in Silverlight you cannot get a FileStream directly; you must make a call to OpenFileDialog, which will give you the FileStream if the user allows it. This is all well and good, but when running in .NET or in unit tests you may want to access the file system directly or supply a mock stream instead. The solution is to make your calls to retrieve streams composable (otherwise known as dependency injection). Create a service interface, and create service instances for different environments.

For example, suppose you have the following (contrived) method in a Silverlight class:

public void DoWork()
{
    var dto = new SimpleDTO { Id = 100, Foo = "Hello World!" };

    // Save the DTO to whatever stream the composed IStateService hands back...
    Stream stream = Open();
    Save(stream, dto);

    // ...then get the stream again and load it back out.
    stream = Open();
    dto = Load<SimpleDTO>(stream);

    Console.WriteLine("{0} : {1}", dto.Id, dto.Foo);
}

The process of opening a stream in Silverlight is different from how it must be done when running in .NET, so we make it composable.

public Stream Open()
{
    // Ask the container for whichever IStateService the host application exported.
    var stateService = container.GetExport<IStateService>().Value;
    stateService.State.Seek(0, SeekOrigin.Begin);
    return stateService.State;
}

Instead of opening the stream ourselves, we import an exported service via MEF that does know how to do it. To do this we simply need access to a container.

private CompositionContainer container;

public ComposableObject(CompositionContainer container)
{
    this.container = container;
}

Our constructor accepts a CompositionContainer as a parameter, which gives us access to all of the composable parts configured for our runtime. Keep in mind this is all Silverlight code at this point. And here is the IStateService.

public interface IStateService : IDisposable
{
    Stream State { get; }
}

The following code snippets come from a plain old .NET console application. First off, here is a snapshot of my Solution Explorer so you can see how things are structured.

[Image: Solution Explorer showing the console application referencing the Silverlight 4 class library]

You can see that I have created a reference from a Console Application project directly to a Silverlight 4 Class Library project. Visual Studio gives me a yellow banger (warning icon) on the reference, presumably because of the framework differences, but it builds just fine. My program loads and calls my Silverlight library just as easily as this:

using System;
using System.ComponentModel.Composition.Hosting;
using SilverlightClassLibrary1;

class Program
{
    static void Main(string[] args)
    {
        // Build a MEF catalog from this assembly so its exports are available to the container.
        var catalog = new AssemblyCatalog(typeof(Program).Assembly);
        using (var container = new CompositionContainer(catalog))
        {
            var co = new ComposableObject(container);
            co.DoWork();
        }

        Console.ReadKey(true);
    }
}

The bits at the beginning are creating a CompositionContainer using the current assembly. MEF allows you to populate containers from all sorts of sources, however, including entire directories full of assemblies, so you can have an easy add-in system. The ComposableObject is the one defined in my Silverlight assembly! No interop nastiness, no AppDomain hassles, it just loads the dll as if it were true .NET code!
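Just to illustrate that add-in scenario, here is a small sketch (mine, not from the original post) of the same Main method built from an AggregateCatalog that combines the current assembly with a DirectoryCatalog pointed at a hypothetical "Plugins" folder:

using System;
using System.ComponentModel.Composition.Hosting;
using SilverlightClassLibrary1;

class PluginHost
{
    static void Main(string[] args)
    {
        // Combine exports from this assembly with exports found in every dll
        // under a "Plugins" folder (the folder name is just an example).
        var catalog = new AggregateCatalog(
            new AssemblyCatalog(typeof(PluginHost).Assembly),
            new DirectoryCatalog("Plugins"));

        using (var container = new CompositionContainer(catalog))
        {
            var co = new ComposableObject(container);
            co.DoWork();
        }

        Console.ReadKey(true);
    }
}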

Next, all I have to do is create an implementation of IStateService and export it.

// .NET-side implementation: lazily backs the state with a real file on disk.
[Export(typeof(IStateService))]
public class ParallelProcessingService : IStateService
{
    private Stream state;

    public Stream State
    {
        get
        {
            if (state == null)
                state = File.Create("state.dat");
            return state;
        }
    }

    public void Dispose()
    {
        if (state != null)
            state.Dispose();
    }
}

Now when I run this application, my Silverlight code will use MEF to load an exported IStateService instance for me. Running this code will then access the file system directly even though I'm running a Silverlight class library.

So what you should do is create a Silverlight class library with all of your logic, composed in a similar fashion to the above. Then in your Silverlight application you simply implement and export all of the Silverlight-specific code as services. You do the same in your .NET unit test projects, and you'll be able to run the exact same assembly in both places.
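To make that concrete, here is a rough sketch (my own illustration, not code from the post) of what the Silverlight-side and test-side exports of IStateService might look like. The Silverlight version uses isolated storage rather than OpenFileDialog purely so it can hand back a read/write stream, and the class and file names are made up:

using System;
using System.ComponentModel.Composition;
using System.IO;
using System.IO.IsolatedStorage;

// Hypothetical Silverlight-side export: backs the state with isolated storage.
// IStateService is the shared interface from the Silverlight class library.
[Export(typeof(IStateService))]
public class IsolatedStorageStateService : IStateService
{
    private Stream state;

    public Stream State
    {
        get
        {
            if (state == null)
            {
                var store = IsolatedStorageFile.GetUserStoreForApplication();
                state = store.OpenFile("state.dat", FileMode.OpenOrCreate);
            }
            return state;
        }
    }

    public void Dispose()
    {
        if (state != null)
            state.Dispose();
    }
}

// Hypothetical test-side export: hands back an in-memory stream so unit tests
// never touch the disk.
[Export(typeof(IStateService))]
public class InMemoryStateService : IStateService
{
    private readonly MemoryStream state = new MemoryStream();

    public Stream State
    {
        get { return state; }
    }

    public void Dispose()
    {
        state.Dispose();
    }
}

Whichever assembly the catalog is built from determines which export the shared logic ends up using.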

The bonus to this, of course, is that you'll also be able to swap out logic that you actually want to be different in different places. For example, if you're creating a business application you could put all of your business logic into a single assembly that runs on both the client and the server. However, what that logic does and how it does it might differ between the two. You may need to determine whether a particular value of your business object is unique: on the client you want to make an asynchronous web request back to the server, but on the server you want to query the database directly. Since it's the same object and the same assembly in both locations, you achieve this by making the ValidateUnique rule itself composable; the object applies the rule in the same way everywhere while the rule's implementation varies per host.
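Here is a minimal sketch of that idea, assuming a MEF setup like the one above; the interface, class names, and signatures are all hypothetical and just illustrate the shape of a composable uniqueness rule:

using System;
using System.ComponentModel.Composition;

// Hypothetical contract that the shared business assembly composes in.
public interface IValidateUniqueRule
{
    // A callback keeps the contract usable from Silverlight, where the server
    // round trip has to be asynchronous.
    void ValidateUnique(string propertyName, object value, Action<bool> onResult);
}

// Exported only from the Silverlight client project: asks the server.
[Export(typeof(IValidateUniqueRule))]
public class ClientValidateUniqueRule : IValidateUniqueRule
{
    public void ValidateUnique(string propertyName, object value, Action<bool> onResult)
    {
        // Begin an asynchronous web/WCF call back to the server here and invoke
        // onResult(isUnique) when the response arrives.
        onResult(true); // placeholder
    }
}

// Exported only from the server project: goes straight to the database.
[Export(typeof(IValidateUniqueRule))]
public class ServerValidateUniqueRule : IValidateUniqueRule
{
    public void ValidateUnique(string propertyName, object value, Action<bool> onResult)
    {
        bool isUnique = true; // placeholder for a direct database query
        onResult(isUnique);
    }
}

The business object resolves IValidateUniqueRule from the container exactly the way Open() resolved IStateService earlier, so it never needs to know which host it is running in.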

In fact this technique can be very pervasive and powerful. Running on multiple frameworks requires you to be composable, which may also inadvertently force some good practices on you.

One other thing to note: I had to set Copy Local = True on some of my references in the Silverlight class library to get it to run correctly in .NET. Since those assemblies aren't in the GAC on the desktop framework, .NET won't load them unless they tag along with your assembly.

[Image: Reference properties showing Copy Local set to True]

I didn’t test this out myself but you wouldn’t want those files appearing in your .xap file for your Silverlight application. I’m pretty sure that it would be smart enough to exclude them but double check.

Turn on Code Analysis Early

Both at work and on my most recent project, MetaSharp, I have been working on retroactively hooking up StyleCop to an existing code base. Let me just say that from now on this will be one of the first things I do when I start a new project! It's sooo much easier to simply fix up new code as you write it than it is to go back over many files and fix literally hundreds of errors all at once.

It's also important to hook up StyleCop and FxCop, or whatever code analysis tool you are using, to your build process. Don't let it be an optional manual step; get it integrated right in with your build project files so you can get those error messages right away.

I’m happy to say that MetaSharp is now fully compliant with FxCop and StyleCop with almost all of the rules turned on. I had to turn down the documentation rules a little bit and the File Header rules. Anything that goes in a file header that is unique to that file defeats the purpose in my opinion.

The moral of the story is that if you don't turn it on right away, it's likely to be too much of a pain to ever turn it on! And it really does add a lot of value to the code, I believe.

MetaSharp code generation success!

I finally have an end-to-end example of code generation with MGrammar that I'm happy with, and it's pretty simple too, I think. What I have working right now is the scenario where you author a grammar for use as an external DSL and would like to use it in existing applications. You are able to add templates to any .NET project and generate code in the project's native language, which means the same template will work for C#, VB, F#, Boo, or whatever, since it compiles down to the CodeDom.

In addition to your MGrammar language definition you can now create a template in MetaSharp; an MSBuild task compiles that template file into a Template class in the native language of your project. That template class knows how to generate code that binds to a model (MGraph objects or CLR objects).

Here is an example of a template in MetaSharp.

namespace Samples.Song.Templates:
    import System;
    import System.Collections.Generic;
    import System.Threading;
    import System.Text;
    
    public class {Song.Name} as Song:
        public constructor:
            @for(b in {Song.Bars as enumerable}):
                super.Bars.Add(new Bar(
                    "{b.Note1}",
                    "{b.Note2}",
                    "{b.Note3}",
                    "{b.Note4}"));
            @end
        end
    end
end

It's a very simple language; the interesting parts are the '@' characters and the {…} groups. The @ symbol puts that line into template mode, meaning that it will be a literal line in the template. The curly-bracket groups are binding statements, similar to what you see in XAML, only much less expressive at this point. So far you just specify a path and an optional type to cast it to. "enumerable" is just a helper to cast the value to IEnumerable, or in the case of MGraph to get the sequence for the node.

So if you take the song DSL file from the Song sample included in the Oslo SDK:

Song Fun
- - - D
C C# F G
E E - D
A E - E
G F - E
D C D E
A E D D
D E A C

And apply that to the template above you end up with this code (in C# in this case):

//------------------------------------------------------------------------------
// <auto-generated>
//     This code was generated by a tool.
//     Runtime Version:2.0.50727.3521
//
//     Changes to this file may cause incorrect behavior and will be lost if
//     the code is regenerated.
// </auto-generated>
//------------------------------------------------------------------------------

namespace Samples.Song.Templates
{
    using System;
    using System.Collections.Generic;
    using System.Threading;
    using System.Text;


    public class Fun : Song
    {

        public Fun()
        {
            base.Bars.Add(new Bar("-", "-", "-", "D"));
            base.Bars.Add(new Bar("C", "C#", "F", "G"));
            base.Bars.Add(new Bar("E", "E", "-", "D"));
            base.Bars.Add(new Bar("A", "E", "-", "E"));
            base.Bars.Add(new Bar("G", "F", "-", "E"));
            base.Bars.Add(new Bar("D", "C", "D", "E"));
            base.Bars.Add(new Bar("A", "E", "D", "D"));
            base.Bars.Add(new Bar("D", "E", "A", "C"));
        }
    }
}

So this is really cool! This will allow you to create MGrammar DSLs without having to write complicated code to consume the object graph. And it all happens at build time! I'm planning on doing a little more cleanup and then probably creating a CodePlex project for this. This is exactly what I wanted for NBusiness… hopefully I can get back to that soon, haha!

MetaSharp – A CodeDom based Template Engine using MGrammar

I've been working on a tangential project related to NBusiness for a couple of weeks now and I just wanted to take a moment to get a few of my thoughts out. The project I have been working on I am tentatively calling "MetaSharp" for now. It's been fun and educational, but hopefully it will have real usefulness when it is done. I wanted to have a fully working example before I publicly posted the code (since it's basically prototype quality right now), but if anyone is interested in seeing what I have so far feel free to ask and I'll hook you up somehow.

I'll try to start at the beginning to justify my rationale for creating this strange project. I've been working on NBusiness for quite a while now, and while I've really had NBusiness "working" almost all along, I have never quite been able to get it where I want it to be (complete). If I had to sum up the entire process of working on NBusiness into one sentence it would be "creating a DSL is hard". That's an understatement, frankly. Let me see if I can lay out the various layers required for DSL creation:

- Domain Objects
- Parser
- Compiler
- Template Engine
- Build Integration
- Tooling Support

The first three items are actually relatively easy and pretty fun. This is what we all know how to do: write code to parse strings and stick values into objects. No problem. It turns out the next three layers, which really provide the fit, finish, and ultimate usability of your DSL, are not easy at all. Build integration isn't really that bad, actually, but tooling integration can be a real bear. In the case of a DSL you really want syntax highlighting, IntelliSense, and nice IDE integration for file templates and things like that, maybe a few additional context menus in your IDE and such. For me, I have been trying to integrate into Visual Studio, and I can officially say that I have sunk well over half my time into that aspect alone; it has been one of the hardest things I have ever tried to do. Visual Studio is also architected such that I had to completely redo my parser and compiler to be compatible with its needs. Very painful.

But what is really hanging me up now is what I consider to be a large gap in the .NET DSL world, and that is a suitable templating engine. By templating engine I mean something that can take metadata and translate it into code.

I mean, we have a bunch out there, but they're all (as far as I know) effectively giant string builders. They suffer from Tag Soup and are bound strongly to a specific language implementation. For NBusiness I want to support side-by-side integration with any .NET language: C# or VB or Python or whatever. Re-creating all of these templates for every language is not an option; it's too much upfront work and it's too much long-term maintenance. I absolutely need templates that are based on the CodeDom so I can be language agnostic… but if you've ever tried to use the CodeDom you know how hard it is to work with. Because of this, users are very unlikely to actually make their own templates (which is almost always necessary), and when they do it is a very painful process. So I've been stuck in this conundrum for quite a while: how can you make a template engine that is based on the CodeDom but has the ease of use of a string builder?
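To illustrate the kind of ceremony I'm talking about, here is a plain CodeDom snippet (ordinary .NET code, nothing to do with MetaSharp) that emits a single class with one string property:

using System;
using System.CodeDom;
using System.CodeDom.Compiler;
using System.IO;
using Microsoft.CSharp;

class CodeDomDemo
{
    static void Main()
    {
        var unit = new CodeCompileUnit();
        var ns = new CodeNamespace("Example");
        unit.Namespaces.Add(ns);

        // One class with a backing field and a single property takes all of this.
        var cls = new CodeTypeDeclaration("Customer") { IsClass = true };
        cls.Members.Add(new CodeMemberField(typeof(string), "_name"));

        var prop = new CodeMemberProperty
        {
            Name = "Name",
            Type = new CodeTypeReference(typeof(string)),
            Attributes = MemberAttributes.Public | MemberAttributes.Final
        };
        prop.GetStatements.Add(new CodeMethodReturnStatement(
            new CodeFieldReferenceExpression(new CodeThisReferenceExpression(), "_name")));
        prop.SetStatements.Add(new CodeAssignStatement(
            new CodeFieldReferenceExpression(new CodeThisReferenceExpression(), "_name"),
            new CodePropertySetValueReferenceExpression()));
        cls.Members.Add(prop);
        ns.Types.Add(cls);

        // Render the graph as C# source; swap in another provider for VB, etc.
        using (var writer = new StringWriter())
        {
            new CSharpCodeProvider().GenerateCodeFromCompileUnit(
                unit, writer, new CodeGeneratorOptions());
            Console.WriteLine(writer.ToString());
        }
    }
}
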
Enter MGrammar. Using MGrammar I have found a way to define a DSL for generating code. This DSL turns out to be a full-fledged programming language in and of itself, with the caveat of being restricted only to that which is CLS compliant. I have combined this DSL with the capability to create templates (to extend the language, similar to macros in Boo) and data binding similar to what you have in XAML. The end result allows you to do something similar to this:

namespace Example:
    import System;

    template One:
        public class {Binding Name}:
            {SequenceBinding Items, Template=Two}
        end
    end

    template Two:
        private field {Binding Type} _{Binding Name};
        public property {Binding Type} {Binding Name}:
            get:
                return this._{Binding Name};
            end
            set:
                this._{Binding Name} = value;
            end
        end
    end
end

(This is just an example, the end result might not actually be exactly this syntax)
When compiled, this will generate a class called OneTemplate that inherits from Template and returns a CodeTypeDeclaration object from its Generate method. Extensions such as the BindingExtension shown here can be custom objects that extend behaviors; in this case it binds the name of the class to the Name property (or Name sequence node of an MGraph tree) of the provided metadata.
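Purely as an illustration, the shape of that compiled class might look something like this. Only the OneTemplate and Template names, the Generate method, and the CodeTypeDeclaration return type come from the description above; the Bind helper, the model parameter, and the stand-in base class are invented for the sketch:

using System.CodeDom;

// Stand-in for the real MetaSharp base class, just so the sketch is complete.
public abstract class Template
{
    public abstract CodeTypeDeclaration Generate(object model);
}

// Hypothetical sketch of a compiled template class; not actual MetaSharp output.
public class OneTemplate : Template
{
    public override CodeTypeDeclaration Generate(object model)
    {
        // {Binding Name} would resolve "Name" against the supplied metadata
        // (a CLR object or an MGraph node) at generation time.
        var type = new CodeTypeDeclaration
        {
            Name = (string)Bind(model, "Name"),
            IsClass = true
        };

        // {SequenceBinding Items, Template=Two} would expand the Two template
        // once per item and add each result to type.Members.
        return type;
    }

    // Invented placeholder for whatever binding mechanism the real engine uses.
    private object Bind(object model, string path)
    {
        return null;
    }
}
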
Technically you could write your entire project in pure MetaSharp code but more likely you will write all of your static classes in your rich language of choice and simply use MetaSharp to define templates. Since this is all compiling down to CodeDom objects I have cooked up some MSBuild tasks that simply translate those objects into the code of the project the files exist in. You could share this same file in a VB or C# project and it would compile to the same thing in both assemblies.
Currently I am working on a prototype using the Song example from the MGrammar sample code that will allow you to write songs that generate song classes using templates like these. It's almost working… the CSharpCodeProvider is throwing a random NullReferenceException with no useful error message, which is one reason why a DSL like this is helpful: it should be able to abstract away the pain of working directly with the CodeDom.

Getting a CodeDomProvider in an MSBuild Task

Trying to get the correct CodeDomProvider inside of an MSBuild task wasn’t as easy as I would have liked. Well actually the code itself is pretty simple but finding out how to actually do it was difficult. There doesn’t appear to be a profusion of people doing such a thing so the blogosphere and forums are fairly sparse with related information. After messing around with it for a couple of hours I think I finally found a pretty reliable (aka it doesn’t feel like a dirty hack) way to do it.

The two key bits of information are the <Language /> PropertyItem that each language defines in its own common targets file (by convention) and the CodeDomProvider.GetAllCompilerInfo() method. Here is my targets file.
 
<!-- Reference the assembly where our tasks are defined -->
<UsingTask TaskName="MetaSharp.MSBuild.TemplateTask" AssemblyFile="$(MSBuildExtensionsPath)\MetaSharp\MetaSharp.MSBuild.dll" />

<!-- Compile target (this is the target that calls the compiler task) -->
<Target Name="BeforeBuild">
    <Message Text="Building: @(MetaSharpTemplate)" />
    <TemplateTask Templates="@(MetaSharpTemplate)" Language="$(Language)">
      <Output TaskParameter="Generated" ItemName="Compile" />
    </TemplateTask>
</Target>
</Project>

The key here is the Language="$(Language)" property on the task. And here is my task, with a LINQ statement to find the correct CodeDomProvider.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using Microsoft.Build.Utilities;
using Microsoft.Build.Framework;
using System.Collections;
using System.CodeDom;
using System.CodeDom.Compiler;

namespace MetaSharp.MSBuild
{
    public class TemplateTask : Task
    {
        // Properties
        [Required]
        public ITaskItem[] Templates { get; set; }

        [Required]
        public ITaskItem Language { get; set; }

        [Output]
        public ITaskItem[] Generated { get; set; }

        public override bool Execute()
        {
            base.Log.LogMessage(0, "Building MetaSharp templates.");

            if (this.Language == null || string.IsNullOrEmpty(this.Language.ItemSpec))
            {
                base.Log.LogError("You must have a Language PropertyItem defined somewhere in your project files to specify which CodeDomProvider to use (i.e. <Language>C#</Language>)");
                return false;
            }

            CodeDomProvider provider = FindProvider(this.Language.ItemSpec);

            return true;
        }

        private CodeDomProvider FindProvider(string language)
        {
            CodeDomProvider[] providers = (from info in CodeDomProvider.GetAllCompilerInfo()
                                           from l in info.GetLanguages()
                                           where l.ToUpperInvariant() == language.ToUpperInvariant()
                                           select CodeDomProvider.CreateProvider(l))
                                           .ToArray();

            CodeDomProvider provider = null;
            if (providers.Length == 0)
            {
                Log.LogError("Unable to find a valid CodeDomProvider for this project type. Try adding a valid Language property item to your msbuild project file");
            }
            else if (providers.Length > 1)
            {
                // It would be surprising if this ever happened...
                Log.LogError("Found multiple valid CodeDomProviders for this Language type. Try adding a less ambiguous Language property item to your msbuild project file");
            }
            else
            {
                provider = providers[0];
            }

            return provider;
        }
    }
}

For those of you who are curious, MetaSharp is the tentative name of the CodeDom DSL I have been working on using MGrammar. It's coming along pretty well; this task will build MetaSharp code files to be compiled along with the project they are contained in. There are lots of details to be shored up, but the basic use cases are working right now. When I have things a little more polished I will probably create another post with some samples.