Mischief Managed–Aligning Blog with GitHub and NuGet

For about a year and a half I’ve been working on various open source projects published on GitHub and several of them have packages on NuGet. When I first set them up, I used the name “duovia” which was the name of a little S corp I use from time to time for corp-to-corp consulting projects. But the name was just causing confusion for anyone looking me up and checking out my open source projects.

So I set out to simplify things a bit, consolidating everything around the name “tylerjensen.” I hope this will make it easier for people to find my work and less confusing when they do.

Blog Facelift and BlogEngine.Net Upgrade

Two days ago a friend of mine pointed out that some of my posts displayed a related link to one or more pages on my blog that I had not actually authored. I don’t generally use the pages feature of BlogEngine.Net, so you can imagine my surprise to find that my blog had been hacked by someone trying to promote a cause. If any of you were offended by that content, I sincerely apologize.

I quickly removed the rogue pages and found that the most likely point of entry was a vulnerability in the combination of Disqus and the version of BlogEngine.Net I had been running. The upgrade was not terribly hard, but it was a bit tricky. One side effect was a number of broken links to older posts that used double-escaped characters in their titles and links. Fixing those required enabling requestFiltering with allowDoubleEscaping="true" in the web.config.
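
For reference, this is the kind of web.config entry that allows double-escaped URLs under IIS; take it as a sketch of the relevant setting rather than a copy of my exact config:

<configuration>
  <system.webServer>
    <security>
      <!-- allow double-escaped sequences (e.g. %2520) in older post URLs -->
      <requestFiltering allowDoubleEscaping="true" />
    </security>
  </system.webServer>
</configuration>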

The upgrade also sports a far better theme structure and rather than take the time to migrate my custom theme, I decided to go with the existing standard theme with just one or two modifications. This includes the new blog logo, an homage to the company that made my first computer—Commodore.

And finally, of course, just to be sure it wasn’t a simple case of JavaScript injection via a malicious comment, I changed all my passwords. I also deleted older non-Disqus comments and updated my Disqus settings and password. For now, I’ll keep the current theme. It suits me. And with all of that out of the way, I can get back to keeping this blog current with what I hope will be useful material.

ASP.NET vNext Update from Hanselman

I’ve started to watch with eager interest the work being done on ASP.NET vNext. Read this excellent status rollup from Scott Hanselman.

Here are my favorite items:

  • Runs on Windows, Mac, and Linux.
  • Runtime in-memory compilation with Roslyn compiler.
  • Cloud optimized CoreCLR installed locally (optional).
  • New project.json system that takes NuGet to infinity and beyond.

I don’t often parrot other blog posts, but you really need to read Hanselman. If you haven’t been paying attention to what the team is doing with ASP.NET, you can repent now and get on board.

Also check out David Fowler’s blog and the official team blog.

Merge Algorithm for Multiple Sorted IEnumerable<T> Sources

This evening I was asked to write a merge algorithm to efficiently merge multiple iterator sources, yielding a merged iterator that would not require the algorithm to read all of the data into memory should the sources be very large. I’ve never written such an algorithm nor can I recall seeing one, so I didn’t have a very good answer. Of course that left a simmering thread of thought on the back burner of my brain.

After letting it rattle around a bit and without resorting to old fashioned Googling, I sat down and banged out the following code. It was fun to write and it works, but it took me much too long to write from scratch: about 90 minutes. It may be time to refresh and reload, perhaps by writing a series of posts that implement C# versions of selected algorithms found in a book I recently purchased but have since spent no time reading: Introduction to Algorithms 3rd Edition.

Updated Code (9/6/2014): The original code gets a big performance boost with this refactoring:

public static IEnumerable<T> SortedMerge<T>
  (params IEnumerable<T>[] sortedSources)
  where T : IComparable
{
  if (sortedSources == null || sortedSources.Length == 0)
    throw new ArgumentNullException("sortedSources");

  //1. fetch enumerators for each source
  var enums = (from n in sortedSources
         select n.GetEnumerator()).ToArray();

  //2. create index list indicating what MoveNext returned for each enumerator
  var enumHasValue = new List<bool>(enums.Length);
  // MoveNext on all and initialize enumHasValue
  for (int i = 0; i < enums.Length; i++)
  {
    enumHasValue.Add(enums[i].MoveNext());
  }

  // if all false, nothing to iterate over
  if (enumHasValue.All(x => !x)) yield break;

  //3. loop through
  while (true)
  {
    //find index with lowest value
    var lowIdx = -1;
    T lowVal = default(T);
    for (int i = 0; i < enums.Length; i++)
    {
      if (enumHasValue[i])
      {
        // must get first before doing any compares
        if (lowIdx < 0
            || null == enums[i].Current //null sorts lowest
            || enums[i].Current.CompareTo(lowVal) < 0)
        {
          lowIdx = i;
          lowVal = enums[i].Current;
        }
      }
    }

    //if none found, we're done
    if (lowIdx < 0) break;

    //get next value for enumerator chosen
    enumHasValue[lowIdx] = enums[lowIdx].MoveNext();

    //yield up the lowest value
    yield return lowVal;
  }
}
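
For completeness, here is a quick usage sketch of the refactored method. It assumes SortedMerge<T> is declared in a static class; MergeUtil is simply a name I picked for illustration:

int[] a = { 1, 3, 6, 102, 105, 230 };
int[] b = { 101, 103, 112, 155, 231 };

foreach (var val in MergeUtil.SortedMerge<int>(a, b))
{
  Console.WriteLine(val); // 1, 3, 6, 101, 102, 103, 105, 112, 155, 230, 231
}

The main gain over the original version below is that each iteration does a simple linear scan for the lowest current value rather than re-sorting the list of enumerators.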

Here’s the original code. I hope you enjoy it. And if you see ways to improve on it, please let me know.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;

namespace Merger
{
  class Program
  {
    static void Main(string[] args)
    {
      int[] a = { 1, 3, 6, 102, 105, 230 };
      int[] b = { 101, 103, 112, 155, 231 };

      var mm = new MergeMania();

      foreach(var val in mm.Merge<int>(a, b))
      {
        Console.WriteLine(val);
      }
      Console.ReadLine();
    }
  }

  public class MergeMania
  {
    public IEnumerable<T> Merge<T>(params IEnumerable<T>[] sortedSources) 
      where T : IComparable
    {
      if (sortedSources == null || sortedSources.Length == 0) 
        throw new ArgumentNullException("sortedSources");
      
      //1. fetch enumerators for each source
      var enums = (from n in sortedSources 
             select n.GetEnumerator()).ToArray();
      
      //2. fetch enumerators that have at least one value
      var enumsWithValues = (from n in enums 
                   where n.MoveNext() 
                   select n).ToArray();
      if (enumsWithValues.Length == 0) yield break; //nothing to iterate over
       
      //3. sort by current value in List<IEnumerator<T>>
      var enumsByCurrent = (from n in enumsWithValues 
                  orderby n.Current 
                  select n).ToList();
      //4. loop through
      while (true)
      {
        //yield up the lowest value
        yield return enumsByCurrent[0].Current;

        //move the pointer on the enumerator with that lowest value
        if (!enumsByCurrent[0].MoveNext())
        {
          //remove the first item in the list
          enumsByCurrent.RemoveAt(0);

          //check for empty
          if (enumsByCurrent.Count == 0) break; //we're done
        }
        enumsByCurrent = enumsByCurrent.OrderBy(x => x.Current).ToList();
      }
    }
  }
}

And if this answers any questions for you, please do drop me a line to let me know.

Distributed Cache Library in C#

Last week I had a conversation with another software engineer who asked me about caching. We had an interesting conversation and it was only later that I remembered that I had experimented with a distributed cache library using the precursor to ServiceWire under the covers along with the v3 open edition of ServiceStack.Text for serialization.

It’s funny that I had completely forgotten about this work. It would have been helpful had I remembered it at the time, but now that I’ve dug it back up, it might be a good idea to dust it off and put some finishing touches on it.

Here’s a small test sample:

[TestMethod]
public void TestClientConfigurationAndConnect()
{
   CacheConfiguration.Current.HostStripedNodePort = 8098;
   CacheConfiguration.Current.ClientStripedClusterEndPoints.Add("cluster",
      new IPEndPoint[]
      {
         new IPEndPoint(IPAddress.Parse("127.0.0.1"), 8098) 
      });

   using (var host = new CacheHost("cluster"))
   {
      host.Open();
      using (ICacheClient client = CacheClient.Connect("b", "cluster"))
      {
         Assert.IsNotNull(client);
         int tval = 89;
         int oval;
         client.Write("k", tval);
         var result = client.TryRead("k", out oval, 0, s => 45);
         Assert.IsNotNull(result);
         Assert.AreEqual(tval, oval);
      }
   }
}

This library was designed to make it easy to create a local in-memory cache or a distributed cache. Have a look at the code and you’ll find that hosting the cache service in any application domain is rather easy. If you have multiple service hosts, cache items are distributed across them via a hash of the key, and buckets allow you to create more than one named cache so the same key can be used without conflict. This allows more than one client process to utilize the distributed cache.
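
To illustrate the general idea of hash-based distribution, here is a simplified sketch of the concept. It is not the library’s actual code; the method name and shape are mine:

// pick a node for a key by hashing the bucket and key over the node count
// (conceptual sketch only; the real library adds striping and endpoints)
public static int SelectNodeIndex(string bucket, string key, int nodeCount)
{
   // combine bucket and key so the same key can live in different buckets
   var composite = bucket + "|" + key;
   var hash = composite.GetHashCode() & 0x7FFFFFFF; // force non-negative
   return hash % nodeCount;
}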

Of course, this code is not ready for prime time. It still has many rough edges and lacks any monitoring or stats collection. As an experiment it was fun to write. And my only excuse for not blogging about it before is that I honestly got busy with other things and forgot about it.

Have fun with it and, as always, if you get any good use out of it, I’d love to hear from you.

ServiceMq–A Peer-to-Peer Store and Forward Message Queue in C#

It’s a “catch up on blogging” weekend. Some months ago, while learning more about ZeroMq, I wrote and pushed to GitHub and NuGet a little library called ServiceMq, a peer-to-peer “store and forward” message queue library inspired by what I learned about ZeroMq and incorporating the ServiceWire library I had previously created.

ServiceMq is an experimental library at this point. I have not spent any time thoroughly testing or improving it since I created it. This is not because it’s not a cool project but only because my time has been limited by demands of the day job and family. One must have priorities. That’s what my wife says anyway.

So now with a brief moment of free time, I’m happy to share with you this little bit of work. Let me explain how it works and then I’ll share some test code here to illustrate. If you are interested in it, I urge you to get the NuGet package or clone the code and try it out and let me know if it has been useful to you.

It is also important to mention that I pulled the entire ServiceStack.Text v3 code base into the library (renaming its namespaces for neatness) as the serialization layer used by ServiceMq. It enables fast and easy JSON serialization across the wire without burdening users of the library with special accommodations in their message DTO classes. You should know that after v3 the ServiceStack.Text library’s license changed, so if you plan to use it on your own, be aware of that change. The version I’ve used is 100% compatible with the Apache 2.0 license, and the derivative notice is in the code on GitHub.

The test code below comprises the only tests I’ve written for the project, and they cover only the primary use cases. The tests run both sender and receiver queues in a single process; in practice you would generally use the library in two processes to enable message passing between them.

The store and forward persistence of messages is central to this library: guaranteed sending and receiving mattered more than raw performance. Scale and memory consumption were not addressed in this initial release.

Here’s the order of events on the sending end (a rough sketch of the pattern follows the list):

  • Send method first writes the message to a file.
  • Send method then tries to send to the intended recipient.
  • If the send fails, the message is placed on a “failed-retry” queue.
  • If the sending process fails or is shut down, all persisted messages are read back into memory when the process restarts and the message queue is created again.
  • When the message is successfully sent, the outbound message file is deleted after the message content is appended to a rolling outbound log so that an audit of messages sent is possible.
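
Here is a rough, self-contained sketch of that send-side pattern in plain C#. It only illustrates the flow described above; it is not ServiceMq’s actual implementation, and the method, parameters, and types are my own invention:

// illustrative only: persist first, then attempt delivery, retry on failure
public static void SendWithStoreAndForward(string outboxDir, string logFile,
    Queue<string> retryQueue, string message, Action<string> transmit)
{
    // 1. write the message to its own file before anything else
    var msgFile = Path.Combine(outboxDir, Guid.NewGuid().ToString("N") + ".msg");
    File.WriteAllText(msgFile, message);
    try
    {
        // 2. attempt to send to the intended recipient
        transmit(message);

        // 3. on success, append to the rolling outbound log, then delete the file
        File.AppendAllText(logFile, message + Environment.NewLine);
        File.Delete(msgFile);
    }
    catch (Exception)
    {
        // 4. on failure, park the message file on a retry queue; it stays on
        //    disk so it can be reloaded if the process restarts
        retryQueue.Enqueue(msgFile);
    }
}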

Now here’s the order of events on the receiving end:

  • The message queue receives a message and writes it to a file.
  • The queue’s Receive method is called and pulls a message off the queue when one becomes available, calling the Acknowledge method automatically (see more on Acknowledge below).
  • Or the queue’s Accept method is called and pulls a message off the queue when one becomes available but does NOT call the Acknowledge method. This is used by code that may fail to process the message, in which case the message is not actually removed from the inbound queue (see the sketch after this list).
  • The Acknowledge method is called, either automatically in the Receive method, or manually after the Accept method is used. The Acknowledge method appends the message to the inbound message log and deletes the individual message file.
  • If the receiving process fails before the Acknowledge method is called to log the message and delete its file, the incoming queue will read it back into memory before taking new messages, in order to guarantee order of delivery.
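
Assuming that model, the Accept/Acknowledge pattern looks roughly like this on the consuming side. The method names come from the description above, but the exact signatures are my assumption, not verbatim library code:

// accept a message without removing it from the inbound queue,
// and only acknowledge it once processing succeeds
var msg = queue.Accept();      // queue is a MessageQueue like q2 in the tests below
try
{
    ProcessMessage(msg);       // your own processing logic
    queue.Acknowledge(msg);    // logs the message and deletes its file
}
catch (Exception)
{
    // not acknowledged, so the message remains in the inbound queue
    // and can be accepted and processed again later
}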

Now here’s the test code that shows how each end works:

[TestMethod]
public void SimpleTest()
{
    var q1Address = new Address("q1pipe");
    var q2Address = new Address("q2pipe");
    using (var q2 = new MessageQueue("q2", q2Address, @"c:\temp\q2"))
    using (var q1 = new MessageQueue("q1", q1Address, @"c:\temp\q1"))
    {
        q1.Send(q2Address, "hello world");
        var msg = q2.Receive();
        Assert.IsNotNull(msg);
        Assert.AreEqual(msg.MessageString, "hello world");
    }
}

[TestMethod]
public void SimpleTcpTest()
{
    var q1Address = new Address(Dns.GetHostName(), 8967);
    var q2Address = new Address(Dns.GetHostName(), 8968);
    using (var q2 = new MessageQueue("q2", q2Address, @"c:\temp\q2"))
    using (var q1 = new MessageQueue("q1", q1Address, @"c:\temp\q1"))
    {
        q1.Send(q2Address, "hello world");
        var msg = q2.Receive();
        Assert.IsNotNull(msg);
        Assert.AreEqual(msg.MessageString, "hello world");
    }
}

[TestMethod]
public void SimpleObjectTest()
{
    var q1Address = new Address("q6pipe");
    var q2Address = new Address("q8pipe");
    using (var q2 = new MessageQueue("q8", q2Address, @"c:\temp\q8"))
    using (var q1 = new MessageQueue("q6", q1Address, @"c:\temp\q6"))
    {
        int[] data = new int[] { 4, 8, 9, 24 };
        q1.Send(q2Address, data);
        Message msg = q2.Receive();
        Assert.IsNotNull(msg);
        var data2 = msg.To<int[]>();
        Assert.AreEqual(data[1], data2[1]);
    }
}

[TestMethod]
public void SimpleBinaryTest()
{
    var q1Address = new Address("q3pipe");
    var q2Address = new Address("q4pipe");
    using (var q2 = new MessageQueue("q4", q2Address, @"c:\temp\q4"))
    using (var q1 = new MessageQueue("q3", q1Address, @"c:\temp\q3"))
    {
        byte[] data = new byte[] { 4, 8, 9, 24 };
        q1.SendBytes(q2Address, data, "mybytestest");
        Message msg = null;
        while (true)
        {
            msg = q2.Receive();
            if (msg.MessageBytes != null) break;
        }
        Assert.IsNotNull(msg);
        Assert.AreEqual(msg.MessageBytes.Length, 4);
        Assert.AreEqual(msg.MessageBytes[2], (byte)9);
        Assert.AreEqual(msg.MessageTypeName, "mybytestest");
    }
}

I’m certain the code base needs work and needs to be tested under load and limited memory circumstances. Perhaps even a caching strategy needs to be implemented for scenarios where message volume is very high. I look forward to your feedback.

Windows Service in .NET using ServiceRunner

About six months ago I wrote a tiny bit of code that I called ServiceRunner. I put it up on NuGet and GitHub, but never got around to blogging about it until today. And since I’ve already blogged today about writing a Windows Service, it seemed a good time to share.

Why? Because I had grown tired of wiring up a Windows Service host for one project or another and wanted to reduce it to the least amount of code possible, all while keeping the project a standard console app to make debugging as simple and easy as possible.

Here is the easiest path to a working Windows Service:

  1. Create a .NET console app in Visual Studio.
  2. Install the NuGet package called ServiceRunner with Install-Package ServiceRunner.
  3. Add a class that inherits from ServiceRunner.ServiceRunnerInstaller as shown below.
  4. Add a simple bit of code to your console app’s Main method as shown below.
  5. Build and debug with the Runner’s runAsConsole constructor parameter set to false.
  6. When ready to deploy as a service, change that parameter to true. How you do that is up to you.
  7. Now run the InstallUtil command line installer as installutil c:\yourpath\yourapp.exe -i and your service is installed and ready to run. (If you use a Visual Studio command prompt, installutil will be in your path. Otherwise you’ll find it in the .NET Framework install directory under C:\Windows\Microsoft.NET\Framework or Framework64, in the version folder such as v4.0.30319.)

Here’s the code for the required installer class:

using ServiceRunner;

namespace ServiceRunnerDemo
{
   /// <summary>
   /// This class (name unimportant) must exist in your console app
   /// for the installer to be recognized when you run installutil.exe
   /// from the Windows\Microsoft.NET\Framework64\v4.0.30319 directory.
   /// </summary>
   public class MyInstaller : ServiceRunnerInstaller
   {
      protected override string ServiceName
      {
         get { return "ServiceRunner"; }
      }

      protected override string ServiceDescription
      {
         get { return "Service Runner description"; }
      }

      protected override string ServiceDisplayName
      {
         get { return "Service Runner"; }
      }

      protected override ServiceRunnerStartMode StartMode
      {
         get { return ServiceRunnerStartMode.Manual; }
      }

      protected override ServiceRunnerAccount Account
      {
         get { return ServiceRunnerAccount.LocalSystem; }
      }
   }
}

And here’s the code for the console app Main method.

using System;
using System.IO;
using ServiceRunner;

namespace ServiceRunnerDemo
{
   class Program
   {
      static void Main(string[] args)
      {
         var logFile = "c:\\temp\\logit.txt";
         var runner = new Runner("MyServiceRunnerDemo", runAsConsole: false);
         runner.Run(args, 
            arguments =>
            {
               // equivalent of OnStart
               File.WriteAllLines(logFile, new string[]
               {
                  string.Format("args count: {0}", arguments.Length)
               });
               Console.WriteLine("args count: {0}", arguments.Length);

               // normally you would launch a worker thread here 
               // to do whatever your service would do 

               // append so this entry does not overwrite the args entry above
               File.AppendAllLines(logFile, new string[]
               {
                  "start called"
               });
               Console.WriteLine("start called");
            }, 
            () =>
            {
               // equivalent of OnStop
               // append the stop entry to the same log file
               File.AppendAllLines(logFile, new string[]
               {
                  "stop called"
               });
               Console.WriteLine("stop called");
            });
         Console.ReadLine();
      }
   }
}

As you can see, the code is very simple. There is far less to worry about than using the standard Visual Studio project template or trying to manually cobble together the installer and other pieces required. If you get any good use out of it, I would love to hear from you.

Happy Windows Service writing!

Windows Service in the D Programming Language

Nine months ago I blogged about my curiosity regarding the D programming language. It is possible that this curiosity is turning into a hobby. Time will tell. Recently I decided to create a Windows Service written in the D programming language. I’ll share my journey to that end here.

When I started I assumed it would be easy to find examples in the open source world from which I could learn. That assumption turned out to be mostly false. I found some posts on the forum that were helpful. I then dug up an email address using that famously free detective, Google. Graham Fawcett was very helpful in sharing some code with me but for some reason I could not get it to work.

After a week or more of evenings attempting to find a solution, I gave up and offered a bounty on the forum. And Vladimir Panteleev, a regular D community contributor, came to my rescue and gladly took the $100 bounty to be applied toward other issues he would like to see resolved. My deep thanks to both of these community members.

As it turns out, the code that Graham shared with me would have worked except the Win32 bindings code had an x64 bug that would not have manifested itself had I been compiling for x86. Specifically, the winsvc.d code file contained the line:

alias DWORD SERVICE_STATUS_HANDLE;

I took Vladimir’s advice and changed it to:

alias size_t SERVICE_STATUS_HANDLE;

And then later pulled in his final fix as:

mixin DECLARE_HANDLE!("SERVICE_STATUS_HANDLE");

I won’t try to explain the differences here. In any case, the handle in the x64 world needed to be a ulong but it was getting declared as a uint (C# equivalents here). Once that was resolved, I was very happy to see the code work.
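
As a rough C# analogy of why that matters (my illustration, not code from the bindings): on x64 a handle is pointer-sized, so storing it in a 32-bit integer silently truncates it.

// a handle on x64 is pointer-sized (8 bytes); a uint holds only 4 bytes
ulong handleValue = 0x100000001UL;   // a value that needs more than 32 bits
uint truncated = (uint)handleValue;  // high bits are lost: 0x00000001
Console.WriteLine("{0:X} vs {1:X}", truncated, handleValue);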

You can get or read the code for that first success on GitHub. I refactored that code using a reference app in C++ that followed a familiar pattern, having written many Windows Services in C#, even to the point of writing a shortcut for standing up a Windows Service in .NET.

In any case, if you are curious to see my first real foray into the world of D programming, you can check out the code on GitHub. It even includes a minor improvement suggested by a forum member already. And if you have questions or need help, please leave a comment and I will do my best to help.