2008 and two years of blogging

It’s nearly time to wrap up. At the moment I’m finishing a white paper on some SaaS integration, and within an hour I’ll leave the office for an evening with friends.

On to 2008! Yes, 2008… A new year full of great opportunities. There’s so much to explore! 2008 will be a year of lots of new challenges for me, on- and offline. For example, I have to finish my studies, make changes to the web, etc., etc.
And as some of the developers have noticed on SPN, Sitecore will release a new version of the product. Hopefully it will be as much fun to explore as 5.3 was. 🙂

There are some ambitions that will finally become reality: in 2008 I’m finally going to release my first Sitecore Developer toolkit ;). Yes, that’s a promise :D. You’ll hear a lot more about all my plans for 2008 very soon; I’m sure you’ll like it…

And a couple of weeks ago I reached another milestone in blogging: 2 years. And I’m still active… :). Sometimes a little less, and sometimes, like today, a little more! It looks a bit like a sine wave.

Yesterday I upgraded my blog to v2.3.2, so I’m totally 2008-ready ;). Some stats for the fans:

  • 212 posts
  • 68,956 spam comments (and counting…)
  • 5,000+ unique visitors per week
  • 800 megabytes of traffic per month

I think it’s time to place some ads on my blog…

Last but not least, for those who are interested and familiar with the Dutch language: check out LECTRIC’s new website (our department). Good work, team Belarus and Sjoerd(-s)!!

There’s nothing else to say but: have a great evening, and hopefully I’ll see you again in 2008 (on my blog or in real life!).

Favicon.ico, why include it in the default dists?

One of the things I personally hate about Sitecore is that favicon.ico is always included in the standard distributions. At LECTRIC we’ve decided to remove favicon.ico by default at the start or deployment of a project (although sometimes one or two slip through). That way we never have to answer the customer’s question of why the Sitecore logo appears on their website (in the browser’s navigation bar).
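As a side note, this kind of cleanup is easy to automate. Here’s a hypothetical sketch (the actual LECTRIC deployment step isn’t shown in this post) that deletes a distribution’s favicon.ico from a given web root:

```csharp
using System;
using System.IO;

// Hypothetical deployment step: remove any favicon.ico that shipped
// with the distribution, so the vendor logo never reaches production.
public class RemoveFavicon
{
    public static void Main(string[] args)
    {
        // Web root to clean; defaults to the current directory.
        string webRoot = args.Length > 0 ? args[0] : ".";
        string favicon = Path.Combine(webRoot, "favicon.ico");

        if (File.Exists(favicon))
        {
            File.Delete(favicon);
            Console.WriteLine("Removed " + favicon);
        }
    }
}
```

Run it against the site’s web root as the last step of a deployment and the question never comes up.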

My question… why include it anyway? Is it that interesting for marketing? Other vendors use the same trick, but why?
Some random thoughts on the last morning of 2008…

A Sitecore success in the States

A while ago, I wrote about an American university that is implementing Sitecore. As you can read on their blog and see on their website, they are well on their way!

I quite like the design they’ve used. It has a serious focus, but it’s never old-fashioned like you’re used to from university websites…

For all those who have to deal with large projects such as this one (over 220 applications/sites to migrate), I’d recommend reading their blog once in a while. They aren’t updating it as frequently as I would like, but every time they provide very interesting information about the whole process…

Do you think the market is stupid?

Well, some vendors do. Please read the following quote, copied from the press release for Tridion v5.3:

Modular Templating truly separates the design, code and applications of templates into template building blocks. Together with native integration with market-leading design and development tools, (including Adobe Dreamweaver and Microsoft Visual Studio .NET), Modular Templating gives designers, developers and programmers much greater control over each aspect of their template design and code.  By combining and reusing template building blocks, Modular Templates can be assembled for a variety of uses.

I’m sorry, are you kidding? I know over 20 CMSes with a clear separation in their templates.
This might look like a direct attack on Tridion. Well, it isn’t; Tridion is a very nice product that is often ahead of lots of other vendors. But please, don’t sell this kind of bullshit to your customers. Some day they will use it against you…

I’m nuts

Yes, it’s true. Today I found out that I’ve really messed up my laptop with way too much information.
Well, let’s start over. Since last Friday I’ve been a little ill. Something is going on in my stomach, and I really don’t know what it is. So between visiting the toilet, drinking lots of water (it seems to be very good for you) and sleeping, I’ve got way too much time to think.

Now and then, I grab my notebook and sit down to surf around a bit, prepare some stuff for the upcoming week, etc. And while I’m online, a lot of information passes by. Of course I’ve got a feed reader with over 100 subscriptions. I’m on 30 social networks, receive over 30 newsletters daily, scrobble my latest downloaded music to Last.fm, etc., etc. I’m the Information Worker Microsoft would love to use as a case study. But I’m also a total n00b when it comes to working efficiently. For example, I’ve got 3 unbelievably huge folders on my desktop: ‘Downloads’, ‘On the read queue’ and ‘Screenshots’. None of them are cleaned up very often. And even worse, I use my D:-root as a movie store…

Yesterday afternoon, a guy with a similar name told me that there’s a Silverlight challenge in Europe. And after saving another 5 documents to my ‘On the read queue’ folder, I decided I need something like del.icio.us, but with the possibility to note whether I’ve read the information and how I rate it. Now, 35 minutes later, I’ve created a del.icio.us account, written down some requirements, downloaded the December CTP of Expression Blend 2, installed the Silverlight 1.1 tools for VS2008, designed the architecture of the app right on the wall, etc.
But actually I don’t want to write this application. I know del.icio.us delivers feeds which would make it possible to show all my stuff exactly the way I want it shown. But I just don’t want to write it. Why do I have to reinvent the wheel? And of course, it’s a nice contest, but I just want this project out of the box, without installing toolbars such as the del.icio.us one… Argh, why isn’t my notebook always available, and why isn’t always-online the standard in Europe yet?

Argghhh, I’m getting tired of my own questions. But it’s really amazing… We’ve built 2100 programming languages, tools, etc., but we can’t even structure or transfer our data in a self-defined way :(. Yeah, I guess that’s the topic: structure your data in your own, self-defined way…

Anyway, I’m off to bed. Guess I’m not the only one who struggles with this…

PLINQ Community

One of the coolest things about new initiatives such as PLINQ is that they are adopted extremely early by developers who work and test with them. On the MSDN Forum, a lot of interesting discussions started this weekend. If you’re interested in the way Microsoft handles feedback, take a look at the following topics. And if you’ve got an opinion of your own, feel free to join in :)!

Hopefully, Sitecore will have such an active community when it comes to features. When developing a framework, I definitely believe in direct user feedback.

PLINQ to Sitecore: Compared with Sitecore Query and old fashioned Foreach

On Kim’s request, I’ve written some additional tests:

    private void PerformWorkOldFashioned()
    {
        using (new SectionTimer("Old fashioned using the foreach and if statement", true))
        {
            foreach (Item item in CurrentItem.Children)
            {
                if (item.Appearance.Sortorder > 10)
                {
                    Response.Write(item.DisplayName + "<br />");
                }
            }
        }
    }

    private void PerformWorkSitecoreQuery()
    {
        using (new SectionTimer("Sitecore Query", true))
        {
            Item[] items = CurrentItem.Axes.SelectItems(".//item[@sortorder>10]");
            // Iterate the query result, not CurrentItem.Children.
            foreach (Item item in items)
            {
                Response.Write(item.DisplayName + "<br />");
            }
        }
    }

You can see the full test results in this Excel sheet. And I know there are better ways of profiling, but this one was set up in another 5 minutes. So that’s cool :).

The final result:

  • LINQ: 53.09 ms
  • PLINQ: 33.9 ms
  • Old fashioned: 37.06 ms
  • Sitecore Query: 59.23 ms

As you can see, PLINQ is the winner, but not by as much as you might expect. I should repeat the test on a larger dataset, with a performance-optimized CTP, later on. You can also see the impact of the predicate in the Sitecore Query. Damn! As Kim mentioned before, Sitecore queries with predicates are slow :(. Sorry, Jakob ;)!

I’ve also run Code Metrics on the 4 methods (Maintainability Index, Cyclomatic Complexity, Class Coupling and Lines of Code):

  • LINQ: 63 – 8 – 12 – 7
  • PLINQ: 57 – 8 – 11 – 13
  • Old fashioned: 68 – 5 – 8 – 5
  • Sitecore Query: 68 – 4 – 8 – 5

In this case, Sitecore Query might look like the winner, but I guess Visual Studio doesn’t account for the complexity hidden behind the query. Taking that into account, the old-fashioned way is the winner, but the differences are minimal.

The result for PLINQ is largely caused by the delegate, which introduces a lot of additional lines of code (whatever that may mean ;)).

That’s it for today folks. Tomorrow more about using LINQ with Sitecore!

PLINQ to Sitecore

A lot of exciting weeks have passed by. The introduction of VS2008 on MSDN, Silverlight 1.1 being renamed to 2.0 and… PLINQ appeared! Last Friday, Runi spent some time showing what LINQ can do for you in a Sitecore environment. As it’s only Runi’s second post in 6 months, I don’t expect him to start a blog series. I won’t either, but whenever possible I’ll definitely try to show the advantages of using LINQ with Sitecore.

But uhm, hey Alex, a second ago you were talking about PLINQ and now about LINQ again? Indeed, today’s topic is PLINQ. That’s an extremely cool new library from Microsoft, named ‘Parallel Language Integrated Query’. It means your queries will run across all the cores and resources available in your machine!
Have you ever noticed that you aren’t really using that second core of your cool Core Duo machine? Well, I can promise you, you aren’t! I’ve written the following test:

    public partial class Sample_layout : System.Web.UI.Page
    {
        protected Item CurrentItem
        {
            get { return Sitecore.Context.Item; }
        }

        protected IEnumerable<Item> Children
        {
            get { return CurrentItem.Children.Cast<Item>(); }
        }

        protected void Page_Load(object sender, EventArgs e)
        {
            // body omitted in the post
        }

        private void ClearCache()
        {
            // body omitted in the post
        }

        private void PerformWorkParallel()
        {
            using (new SectionTimer("Parallel running through items", true))
            {
                var res = from i in Children.AsParallel<Item>()
                          where i.Appearance.Sortorder > 10
                          orderby i.DisplayName
                          select i;

                Parallel.ForEach<Item>(res, delegate(Item i) { Response.Write(i.DisplayName + "<br />"); });
            }
        }

        private void PerformWorkSequential()
        {
            using (new SectionTimer("Sequential running through items", true))
            {
                var res = from i in Children
                          where i.Appearance.Sortorder > 10
                          orderby i.DisplayName
                          select i;

                foreach (Item item in res)
                {
                    Response.Write(item.DisplayName + "<br />");
                }
            }
        }
    }

What does it do? It makes sure the cache is cleared, then runs through the children of the current item, selects the ones with a Sortorder greater than 10 and writes the output to the Response buffer.
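The SectionTimer helper used above isn’t shown in the post. This is a hypothetical reconstruction of what it might look like: a Stopwatch wrapped in an IDisposable, so a using-block reports how long its body took. The real helper presumably writes to the page’s Response; this sketch writes to the console instead, and the second constructor argument mirrors the signature used in the post (probably a “write output” flag):

```csharp
using System;
using System.Diagnostics;

// Hypothetical sketch of the SectionTimer used in the test code.
// Starting the Stopwatch at construction and stopping it on Dispose
// means the measured span is exactly the body of the using-block.
public class SectionTimer : IDisposable
{
    private readonly string _name;
    private readonly bool _writeOutput;
    private readonly Stopwatch _watch = Stopwatch.StartNew();

    public SectionTimer(string name, bool writeOutput)
    {
        _name = name;
        _writeOutput = writeOutput;
    }

    public void Dispose()
    {
        _watch.Stop();
        if (_writeOutput)
        {
            Console.WriteLine("{0}: {1:F2} ms", _name, _watch.Elapsed.TotalMilliseconds);
        }
    }
}
```

Usage matches the test methods: `using (new SectionTimer("my section", true)) { /* timed work */ }`.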

The result is amazing. In all cases, the parallel query executed faster. Here’s a short list of the results (in ms) I’ve seen:

Sequential Parallel
960.3 * 40.5
57.1 32.3
49.3 35.8
55.1 24.6
53.1 47.1
47.7 28.4
49.2 33.1
52.0 33.9

* = First run; the underlying data providers also had to initialize, so this result might not be very useful.

To make sure the results are not influenced by the order in which the data is requested, I’ve run the same test with the parallel version first:

Parallel Sequential
290* 59.7
32.1 48.6
32.5 51.5
33.4 53.5
30.3 57.2
31.7 53.9
30.5 51.6
33.0 51.4

Overall, minus the highest and the lowest score, the parallel processing finishes 20 ms faster than the sequential (33 ms against 53 ms). An impressive result, don’t you think? Even more so when you consider the initialization results and keep in mind that I’m working on a simple developer notebook. I expect the results on a quad-core server with a large dataset to be even more significant.
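Out of curiosity, the 33 ms vs 53 ms figures can be reproduced from the two tables: take all runs except the marked first ones, drop the single highest and lowest value, and average the rest. This is my own reading of “minus the highest and the lowest score”; the original spreadsheet may have computed it differently:

```csharp
using System;
using System.Linq;

public class TrimmedMean
{
    // Trimmed mean: drop the single lowest and highest value, average the rest.
    public static double Trimmed(double[] values)
    {
        return values.OrderBy(v => v).Skip(1).Take(values.Length - 2).Average();
    }

    public static void Main()
    {
        // Measurements from both tables above, first runs (960.3, 40.5, 290) excluded.
        double[] sequential = { 57.1, 49.3, 55.1, 53.1, 47.7, 49.2, 52.0,
                                59.7, 48.6, 51.5, 53.5, 57.2, 53.9, 51.6, 51.4 };
        double[] parallel   = { 32.3, 35.8, 24.6, 47.1, 28.4, 33.1, 33.9,
                                32.1, 32.5, 33.4, 30.3, 31.7, 30.5, 33.0 };

        Console.WriteLine("Sequential: {0:F2} ms", Trimmed(sequential)); // ≈ 52.6
        Console.WriteLine("Parallel:   {0:F2} ms", Trimmed(parallel));   // ≈ 32.3
    }
}
```

Which lands within rounding distance of the 53 ms and 33 ms quoted above.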

Well, that’s nearly it for now. This small test took me more time than expected.

For additional info about PLINQ, visit the Parallel computing center at MSDN. Some other resources:

  • Parallel Extensions to the .NET Framework Team Blog
  • MSDN Forum for Parallel Extensions to the .NET Framework
  • Parallel Extensions to the .NET Framework Connect site
  • And for those who are really interested in the how and why of concurrency: read ‘The Free Lunch Is Over’ by Microsoft’s C++ architect, Herb Sutter. Also check these two blog posts if you’re interested in the CTP, as they describe the issues with the current CTP of PLINQ.