DSLs in Boo – Tech Review

The book DSLs in Boo: Domain-Specific Languages in .NET, written by Oren Eini (writing as Ayende Rahien), has finally shipped. The final version is available online from Manning and I highly recommend it.

I was one of the technical reviewers, so I have already read this book and can tell you firsthand that it's definitely one for your bookshelf. The first few chapters describe the generalities of DSLs and Boo, and the subsequent chapters cover all of the details you might need to implement an internal DSL in your application.

If you've been wondering about the whole DSL thing, this book would be a good gateway into that world. It'll stretch your brain and certainly expose you to some new ideas. In addition, the author makes a good case for some practical business applications and introduces a new addition to the open source scene: Rhino DSL.

I also got an honorable mention in the acknowledgments section of the book, and a special quote on the Manning web site, which is pretty cool. Check it out!

Pattern Calculus: Computing with Functions and Structures

I haven't read this book yet, but I am moving it to the top of my queue after reading the first couple of parts of chapter 1:

http://books.google.com/books?id=Q_J4Lnmfjx4C&lpg=PP1&dq=pattern%20calculus&pg=PP1#v=onepage&q=&f=false

 

Here is the abstract:

This book develops a new programming style, based on pattern matching, from pure calculus to typed calculus to programming language. It can be viewed as a sober technical development whose worth will be assessed in time by the programming community. However, it actually makes a far grander claim, that the pattern-matching style subsumes the other main styles within it. This is possible because it is the first to fully resolve the tension between functions and data structures that has limited expressive power till now. This introduction lays out the general argument, and then surveys the contents of the book, at the level of the parts, chapters and results.

*emphasis mine

Based on my understanding of the initial claims, it describes the core concepts that OMeta is also based on. I'm not sure if Alessandro Warth has read this book or if this is parallel research, but they seem to have come to the same conclusions. It never ceases to amaze me how concepts that seem so new and fresh to me have almost always already been written about in a 200 page book.

Read-Only Object Initialization In C# (and other misuses of Lambda expressions)

I saw a blog post the other day that explained the core concept of what I’m about to show you, so I can’t claim credit. Unfortunately I can’t find the post now so you’ll have to do some searching if you want to find it for yourself.

http://cid-dfcd2d88d3fe101c.skydrive.live.com/embedrowdetail.aspx/blog/justnbusiness/ExpressionToDictionary.cs

 

This is just one possible usage of a trick I had not seen before in C#. In this case I want to solve, in a very simple way, the problem of using the oh-so-handy object initialization syntax without sacrificing my object model by making all of my properties settable.

Here is an example of object initialization in C#:

var model = new Model { Id = 1, Name = "Justin" };

This is a nice way to create a new instance of an object in C# 3 or above. The only problem is that it requires accessible setters for the properties specified, and that sometimes conflicts with the object model of the type you are creating. The alternative is to create constructors that set the fields backing the properties, as usual, but then you lose the fun new syntax.
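For comparison, here is a minimal sketch of that conventional constructor-based alternative (the class shape and names are mine, just for illustration):

```csharp
using System;

public class Person
{
    // Read-only from the outside: values can only be set in the constructor.
    public int Id { get; private set; }
    public string Name { get; private set; }

    public Person(int id, string name)
    {
        this.Id = id;
        this.Name = name;
    }
}

class Program
{
    static void Main()
    {
        // No object-initializer syntax; just positional arguments.
        var person = new Person(1, "Justin");
        Console.WriteLine("{0} {1}", person.Id, person.Name); // 1 Justin
    }
}
```

It works, but the call site tells you nothing about which argument is which, and that's exactly what the object-initializer syntax fixed.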

So my hack here shows how you can use lambda expressions to get almost the same syntax while still keeping your object's properties read-only.

 

var model = new Model(Id => 1, Name => "justin");

Instead of using the object initialization syntax, you create a constructor that accepts an array of expressions. You can then pass lambdas that look very similar and store their values in your backing fields.

 

public class Model
{
    // Property values, keyed by the lambda parameter names ("Id", "Name").
    private IDictionary<string, object> values;

    public Model(params Expression<Func<string, object>>[] parameters)
    {
        this.values = parameters.ToDictionary();
    }

    // Read-only properties backed by the dictionary, not by setters.
    public int Id { get { return (int)this.values["Id"]; } }

    public string Name { get { return (string)this.values["Name"]; } }
}

 

There are probably better ways of doing this, but hopefully you get the idea. You could also have your model inherit from DynamicObject and redirect the TryGetMember method to your dictionary, like this:

class Program
{
    static void Main(string[] args)
    {
        dynamic model = new Model(Id => 1, Name => "justin");

        Console.WriteLine("Id: {0}, Name: {1}", model.Id, model.Name);
    }
}

public class Model : DynamicObject
{
    private IDictionary<string, object> values;

    public Model(params Expression<Func<string, object>>[] parameters)
    {
        this.values = parameters.ToDictionary();
    }

    public override bool TryGetMember(GetMemberBinder binder, out object result)
    {
        if (this.values.ContainsKey(binder.Name))
        {
            result = this.values[binder.Name];
            return true;
        }

        return base.TryGetMember(binder, out result);
    }
}

Actually, it turns out that this little trick is handy in a lot of scenarios. Essentially what we're doing here is creating a dictionary out of the expressions, using the name of the parameter on the left-hand side of each lambda as the key. Accepting a dictionary as the last parameter is one trick Ruby uses to enable a lot of the DSLs written for it, for example.

Let your mind run wild.

 

Here is the magic ToDictionary method:

static class Extensions
{
    public static IDictionary<string, object> ToDictionary(this Expression<Func<string, object>>[] parameters)
    {
        LazyDictionary<string, object> dictionary = new LazyDictionary<string, object>();
        foreach (var expression in parameters)
        {
            // The lambda's parameter name becomes the key, e.g. "Id" from Id => 1.
            string key = expression.Parameters.Single().Name;

            // Compile but don't invoke; evaluation is deferred until lookup.
            var value = expression.Compile();

            dictionary.Add(key, value);
        }

        return dictionary;
    }

    private class LazyDictionary<T, V> : IDictionary<T, V>
    {
        private Dictionary<T, Func<T, V>> innerDictionary = new Dictionary<T, Func<T, V>>();

        public void Add(T key, Func<T, V> value)
        {
            this.innerDictionary.Add(key, value);
        }

        void IDictionary<T, V>.Add(T key, V value)
        {
            this.innerDictionary.Add(key, v => value);
        }

        bool IDictionary<T, V>.ContainsKey(T key)
        {
            return this.innerDictionary.ContainsKey(key);
        }

        ICollection<T> IDictionary<T, V>.Keys
        {
            get { return this.innerDictionary.Keys; }
        }

        bool IDictionary<T, V>.Remove(T key)
        {
            return this.innerDictionary.Remove(key);
        }

        bool IDictionary<T, V>.TryGetValue(T key, out V value)
        {
            value = default(V);
            Func<T, V> func = null;

            bool exists = this.innerDictionary.TryGetValue(key, out func);
            if (exists)
            {
                value = func(default(T));
            }

            return exists;
        }

        ICollection<V> IDictionary<T, V>.Values
        {
            get
            {
                return (from v in this.innerDictionary.Values
                        select v(default(T)))
                        .ToList();
            }
        }

        V IDictionary<T, V>.this[T key]
        {
            get
            {
                return this.innerDictionary[key](default(T));
            }
            set
            {
                this.innerDictionary[key] = v => value;
            }
        }

        void ICollection<KeyValuePair<T, V>>.Add(KeyValuePair<T, V> item)
        {
            ((ICollection<KeyValuePair<T, Func<T, V>>>)this.innerDictionary).Add(
                new KeyValuePair<T, Func<T, V>>(item.Key, v => item.Value));
        }

        void ICollection<KeyValuePair<T, V>>.Clear()
        {
            this.innerDictionary.Clear();
        }

        bool ICollection<KeyValuePair<T, V>>.Contains(KeyValuePair<T, V> item)
        {
            return this.innerDictionary.ContainsKey(item.Key);
        }

        void ICollection<KeyValuePair<T, V>>.CopyTo(KeyValuePair<T, V>[] array, int arrayIndex)
        {
            throw new NotImplementedException();
        }

        int ICollection<KeyValuePair<T, V>>.Count
        {
            get { return this.innerDictionary.Count; }
        }

        bool ICollection<KeyValuePair<T, V>>.IsReadOnly
        {
            get { return ((ICollection<KeyValuePair<T, Func<T, V>>>)this.innerDictionary).IsReadOnly; }
        }

        bool ICollection<KeyValuePair<T, V>>.Remove(KeyValuePair<T, V> item)
        {
            return this.innerDictionary.Remove(item.Key);
        }

        IEnumerator<KeyValuePair<T, V>> IEnumerable<KeyValuePair<T, V>>.GetEnumerator()
        {
            return (from pair in ((ICollection<KeyValuePair<T, Func<T, V>>>)this.innerDictionary)
                    select new KeyValuePair<T, V>(pair.Key, pair.Value(default(T))))
                   .GetEnumerator();
        }

        System.Collections.IEnumerator System.Collections.IEnumerable.GetEnumerator()
        {
            return (from pair in ((ICollection<KeyValuePair<T, Func<T, V>>>)this.innerDictionary)
                    select new KeyValuePair<T, V>(pair.Key, pair.Value(default(T))))
                   .GetEnumerator();
        }
    }
}
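If you want to see the expression-to-dictionary trick in isolation, here is a minimal, self-contained sketch. It re-implements just the key-extraction step (eagerly, for simplicity) rather than the full LazyDictionary above, and the class name is mine:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Linq.Expressions;

static class ExpressionTrick
{
    // Each key is the lambda's parameter name; each value is the
    // result of invoking the compiled lambda.
    public static IDictionary<string, object> ToDictionary(
        params Expression<Func<string, object>>[] parameters)
    {
        return parameters.ToDictionary(
            e => e.Parameters.Single().Name, // "Id", "Name", ...
            e => e.Compile()(null));         // evaluate the body
    }

    static void Main()
    {
        var values = ToDictionary(Id => 1, Name => "justin");
        Console.WriteLine("{0} {1}", values["Id"], values["Name"]); // 1 justin
    }
}
```

The whole trick lives in `e.Parameters.Single().Name`: the compiler preserves the parameter name in the expression tree, so `Id => 1` carries the string "Id" along with the value.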

Ship It!

[Photo: Ship It award]

 

Because my phone sucks so bad, it's a tiny picture, but this is my first Ship It award from Microsoft, for being a part of the team that shipped Expression Studio 3. Also, even though it looks like copper, it's actually metallic silver. The inscription says:

Ship It

Every time a product ships, it takes us one step closer to the vision: empower people through great software, any time, any place and on any device. Thanks for the lasting contribution you have made to Microsoft history.

Steve Ballmer    Bill Gates

Justin Chase

 

Thanks Steve and Bill!

OMeta

If you haven't heard of it, check this out:

http://www.tinlizzie.org/ometa/

I was just reading Alessandro Warth's Ph.D. dissertation, linked on that page, and this quote is really jumping out at me:

OMeta’s key insight is the realization that all of the passes in a traditional compiler are essentially pattern matching operations:
• a lexical analyzer finds patterns in a stream of characters to produce a stream of tokens;
• a parser matches a stream of tokens against a grammar (which itself is a collection of productions, or patterns) to produce abstract syntax trees (ASTs);
• a typechecker pattern-matches on ASTs to produce ASTs annotated with types;
• more generally, visitors pattern-match on ASTs to produce other ASTs;
• finally, a (naive) code generator pattern-matches on ASTs to produce code.

He is so right. It seems that pattern matching might be the other side of the coin of transformation. Great stuff.
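As a toy illustration of the first bullet (lexing as pattern matching), here's a minimal sketch; the names and the regex are mine, not from the dissertation:

```csharp
using System;
using System.Collections.Generic;
using System.Text.RegularExpressions;

static class TinyLexer
{
    // Each named group is a pattern; lexing is just matching
    // those patterns, in order, against the character stream.
    static readonly Regex Token =
        new Regex(@"(?<number>\d+)|(?<ident>[A-Za-z]\w*)|(?<op>[+\-*/])|\s+");

    public static List<string> Lex(string source)
    {
        var tokens = new List<string>();
        foreach (Match m in Token.Matches(source))
        {
            if (m.Groups["number"].Success) tokens.Add("NUMBER:" + m.Value);
            else if (m.Groups["ident"].Success) tokens.Add("IDENT:" + m.Value);
            else if (m.Groups["op"].Success) tokens.Add("OP:" + m.Value);
            // whitespace matches fall through and are discarded
        }
        return tokens;
    }

    static void Main()
    {
        Console.WriteLine(string.Join(", ", Lex("x + 42")));
        // IDENT:x, OP:+, NUMBER:42
    }
}
```

The same shape repeats at every stage: swap characters for tokens and the regex for a grammar and you have a parser; swap tokens for AST nodes and you have a visitor.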