2008-04-04

Binary response in ASP MVC

Today I wanted to give access to certain files on a website only via my DownloadController. This was so that I could ensure the current user had purchased the item in question first, and also so that I could sign any license info into the download as well.



I tried getting a URL like this to work



http://localhost/download/1/SomeFileName



which would remap to the DownloadController



public void Index(int id, string fileName)





This worked fine, and because the URL ended with "SomeFileName" it would get saved as the correct filename too, but this was no use because SomeFileName has no file extension. As soon as I added .zip on the end the request no longer went via the new HttpHandler in the MVC web extensions. Even when I added it in the <httpHandlers> section of web.config it just wouldn’t work.



My problem was in relying on the URL for the filename. This is apparently not the way it should be done. Instead I should have stuck to the standard URL approach



http://localhost/download/1



and added a special HTTP header known as "content-disposition" to the response; this tells the client what the filename should be. Here is a full example of how to write a binary file to the Response when using the new MVC ASP Web Extensions, and how to have it saved on the client with the correct filename.



public void Index(int id)
{
 IProductRepository productRepository = EcoSpace.GetEcoService<IProductRepository>();
 Product product = productRepository.GetByID(id);
 if (product == null)
 {
  ViewData[GlobalViewDataKeys.ErrorMessage] = "Item not found";
  Response.Redirect("/Account/Home", false);
  return;
 }

 Response.ContentType = "Application/" + Path.GetExtension(product.DownloadUrl).Substring(1);
 Response.AppendHeader("content-disposition", "inline; filename=" + product.DownloadUrl);

 string localFileName = "";
 if (product is Edition)
  localFileName = FilePathUrls.Software;
 else
  if (product is Collateral)
   localFileName = FilePathUrls.Collateral;
  else
   throw new NotImplementedException(product.GetType().Name);

 localFileName = Request.MapPath(localFileName);
 localFileName = Path.Combine(localFileName, product.DownloadUrl);

 FileStream fileStream = new FileStream(localFileName, FileMode.Open, FileAccess.Read, FileShare.Read);
 byte[] data = new byte[fileStream.Length];
 using (fileStream)
  fileStream.Read(data, 0, (int)fileStream.Length);
 Response.BinaryWrite(data);
 Response.End();
}



Thanks go to Phil Haack who pointed me in the right direction and was kind enough to promptly help a complete stranger!

2008-04-03

Silverlight and webservices

First download the binaries you need from here:
http://silverlight.net/GetStarted/

Next run VS2008 and create a new project.

Select the Silverlight node and then the Silverlight Application node.

ProjectName = MySilverlightApp
Tick the checkbox "Create directory for solution"
Click OK

On the wizard page you want the default values:
* Add a new page to the solution for hosting the control
Project type = Web Site
Name = MyWebService

Now delete the two ASPX files, we won't be needing those.

Rename the HTML page to Index.html and set it as the project start page.

Right click the website project and select "Add new item".

Select "Web Service".
Name = DateTimeService.asmx
Click ADD

Change the HelloWorld method to

public DateTime GetServerDateTime()
{
  return DateTime.Now;
}


Right-click the References node on the Silverlight project and select "Add service reference".
Click the "Discover" button, and in the tree view that appears select the DateTimeServer node.
Set the NameSpace to DateTimeServer
Click OK

Now open Page.xaml and enter the following within the <Grid>

<StackPanel>
  <Button x:Name="ButtonGetServerDateTime" Content="Get date time" Click="ButtonGetServerDateTime_Click"/>
  <TextBlock x:Name="TextBlockDateTime" Text="Ready..."/>
</StackPanel>


Note: When you type Click=" you will get the option to hit <TAB> to implement the event handler.

Now open Page.xaml.cs

Add the following private member to the class:
private DateTimeServer.DateTimeServiceSoapClient proxy;


Initialise it in the page’s constructor:
proxy = new MySilverlightApp.DateTimeServer.DateTimeServiceSoapClient();
proxy.GetServerDateTimeCompleted +=
new EventHandler<MySilverlightApp.DateTimeServer.GetServerDateTimeCompletedEventArgs>(proxy_GetServerDateTimeCompleted);


Implement the "Completed" method like so:
TextBlockDateTime.Text = e.Result.ToString();
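

For completeness, the generated "Completed" handler ends up looking something like this (a minimal sketch; the exact signature is the one created when you tabbed the event handler in earlier):

private void proxy_GetServerDateTimeCompleted(object sender, MySilverlightApp.DateTimeServer.GetServerDateTimeCompletedEventArgs e)
{
  //Show the DateTime returned by the web service
  TextBlockDateTime.Text = e.Result.ToString();
}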


Finally, implement the ButtonGetServerDateTime_Click method like so:
private void ButtonGetServerDateTime_Click(object sender, RoutedEventArgs e)
{
  proxy.GetServerDateTimeAsync();
}


That’s it! Run the app and click the button!

2008-03-30

Postal codes within a radius

My hobby MVC website allows people to place adverts. When searching for adverts I would like the user to be able to specify a UK postal code and radius to filter the adverts down to ones within travelling distance. The trick to this was to record a list of UK postal codes and their latitude/longitude.

The first step is to write a routine which will give a straight-line distance between two coordinates:

public static class MathExtender
{
  public static double GetDistanceBetweenPoints(double sourceLatitude, double sourceLongitude, double destLatitude, double destLongitude)
  {
    double theta = sourceLongitude - destLongitude;
    double distance =
      Math.Sin(DegToRad(sourceLatitude))
      * Math.Sin(DegToRad(destLatitude))
      + Math.Cos(DegToRad(sourceLatitude))
      * Math.Cos(DegToRad(destLatitude))
      * Math.Cos(DegToRad(theta));
    distance = Math.Acos(distance);
    distance = RadToDeg(distance);
    distance = distance * 60 * 1.1515; //Degrees -> nautical miles -> statute miles
    return (distance);
  }

  public static double DegToRad(double degrees)
  {
    return (degrees * Math.PI / 180.0);
  }

  public static double RadToDeg(double radians)
  {
    return (radians / Math.PI * 180.0);
  }
}
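
As a quick sanity check of the routine (a hedged sketch; the coordinates are rough values for London and Manchester, not data from my model):

//Approximate coordinates: London (51.51, -0.13), Manchester (53.48, -2.24)
double miles = MathExtender.GetDistanceBetweenPoints(51.51, -0.13, 53.48, -2.24);
Console.WriteLine(miles); //Roughly 160 miles in a straight line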


Additionally, on my PostalCode class I have a helper method like so:

public double GetDistanceTo(PostalCode destination)
{
  return MathExtender.GetDistanceBetweenPoints(Latitude, Longitude, destination.Latitude, destination.Longitude);
}


Now the next problem is that these routines are not selectable via SQL and therefore to find all postal codes within a radius of another I would have to load all postal codes into memory and then evaluate them, which I don’t want to do! The DB might be able to use SIN/COS/ACOS etc but if it does then it would be DB specific and I don’t want that either. My decision was to first look for postal codes within a square area, the square being just big enough to encompass the circle, and then to use the in-memory PostalCode.GetDistanceTo() method to whittle the postcodes down to the exact list of matching objects.
The problem remained that I would still need to establish whether Longitude/Latitude coordinates would fall within this square. To overcome this I decided that if the world were divided up into a grid of squares, each 1 mile in size, I could then easily select postal codes within any size square by checking which 1 mile square each falls in. So I decided to establish Longitude=0 Latitude=0 as 0,0 on my grid and work out every postal code's distance from that point. I added two persistent "Double" properties to my postal code class named GridReferenceX and GridReferenceY which I set whenever the Longitude or Latitude is set:

public double GridReferenceX { get; private set; }
public double GridReferenceY { get; private set; }

public double Longitude
{
  get { ..... };
  set
  {
    ......;
    CalculateGridReference();
  }
}

public double Latitude
{
  get { ..... };
  set
  {
    ......;
    CalculateGridReference();
  }
}

private void CalculateGridReference()
{
  //Latitude distance only (X)
  GridReferenceX = MathExtender.GetDistanceBetweenPoints(0, 0, Latitude, 0);
  //Longitude distance only (Y)
  GridReferenceY = MathExtender.GetDistanceBetweenPoints(0, 0, 0, Longitude);
}


Now I have a grid reference for each postal code which really means nothing, but the advantage is that they can be compared to each other like so...

public IList<PostalCode> GetAllWithinRadius(PostalCode postalCode, double radiusInMiles)
{
  List<PostalCode> result = new List<PostalCode>();
  string criteria = string.Format("->select(gridReferenceX >= {0} and gridReferenceX <= {1} and gridReferenceY >= {2} and gridReferenceY <= {3})",
    postalCode.GridReferenceX - radiusInMiles,
    postalCode.GridReferenceX + radiusInMiles,
    postalCode.GridReferenceY - radiusInMiles,
    postalCode.GridReferenceY + radiusInMiles);

  IList<PostalCode> allPostalCodes = BusinessClassesHelper.SelectObjects<PostalCode>(ServiceProvider, "PostalCode", criteria);
  return
    (
      from selectedPostalCode in allPostalCodes
      where postalCode.GetDistanceTo(selectedPostalCode) <= radiusInMiles
      select selectedPostalCode
    ).ToList<PostalCode>();
}



This first uses SQL to exclude all postal codes which cannot possibly be within the radius, then a simple LINQ query returns only the postal codes that are actually within the radius of the specified postal code. Another benefit is that this will also work for postal codes anywhere in the world; I could quite easily check how far (in a straight line) it is from my UK postal code to a US zip code!
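
Typical usage then looks something like this (a sketch; the repository instance and the GetByCode lookup are hypothetical):

PostalCode home = postalCodeRepository.GetByCode("M1 1AA"); //Hypothetical lookup method
IList<PostalCode> nearby = postalCodeRepository.GetAllWithinRadius(home, 20);
//nearby now contains every postal code within a 20 mile straight-line radius of home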

2008-03-10

Test Driven MVC and ECO

I have decided that mocking IEcoServiceProvider is not the way to go. Your controller will use the mocked provider during testing but


     
  1. You don’t want to have to mock every service the provider may return, it’s a lot of work!

  2. You don’t want your controller using a mocked service, and then the EcoSpace using the real one!



At first I was mocking every possible service request. IUndoService, IPersistenceService, IOclService, etc. I get bored typing them out in this blog, so doing it in tests was really annoying me. I decided I would instead only mock the service in question. So if I were ensuring that an action won't save an object with broken constraints I would mock GetEcoService<IConstraintProvider> and ensure that I always got a broken constraint.

The problem was that the test to ensure I can save a valid object would then invoke the IPersistenceService.UpdateDatabaseWithList method. In my EcoSpace I have decorated my persistence service so that it checks every object in the update list to ensure it has no invalid constraints. At the point it asks for IConstraintProvider it is using the real IEcoServiceProvider and as a result it gets the real IConstraintProvider. In short the object would only save if it were really valid, and not if my mocked constraint provider pretended it was.

Ultimately I found it much easier just to register the mock service on the EcoSpace. To do this all I had to do was to expose a public method on the EcoSpace like so

#if DEBUG
 public void RegisterMockService(Type serviceType, object serviceInstance)
 {
  RegisterEcoService(serviceType, serviceInstance);
 }
#endif


Now I can replace any service on the EcoSpace, so even the EcoSpace itself will get mocked services during testing. This is a lot less work!
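
So in a test's SetUp I can now do something along these lines (a sketch, assuming a Rhino Mocks mock of IConstraintProvider):

IConstraintProvider mockConstraintProvider = Mocks.CreateMock<IConstraintProvider>();
EcoSpace.RegisterMockService(typeof(IConstraintProvider), mockConstraintProvider);
//From this point on both the controller and the EcoSpace itself see the mocked service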


Talking of a lot less work: although I was initially annoyed that the new field test of the ASP .NET web extensions had switched from using interfaces back over to using objects, it turns out that testing is actually much easier than it was previously. Scott Hanselman posted some testing classes on his blog recently, but they came out all screwy, so I will repost the corrected version below (I hope he doesn't mind). Testing is now as easy as this...

[TestClass]
public class AccountTests
{
 //This is a descendant of my real EcoSpace,
 //but I replace the persistence mapper with
 //PersistenceMapperMemory after construction so that
 //no DB access is needed.
 MemoryEcoSpace EcoSpace;
 AccountController Controller;
 MockRepository Mocks;
 FakeViewEngine FakeViewEngine;

 [TestInitialize]
 public void SetUp()
 {
  EcoSpace = new MemoryEcoSpace();
  EcoSpace.Active = true;

  Mocks = new MockRepository();
  FakeViewEngine = new FakeViewEngine();
  Controller = new AccountController();
  Controller.ViewEngine = FakeViewEngine;
  using (Mocks.Record())
  {
   Mocks.SetFakeControllerContext(Controller);
  }
 }

 [TestMethod]
 public void Create()
 {
  using (Mocks.Playback())
  {
   Controller.Create();
   Assert.AreEqual("Create", FakeViewEngine.ViewContext.ViewName);
  }
 }

 [TestMethod]
 public void CreateUpdate_ConfirmationEmailDoesNotMatch()
 {
  using (Mocks.Playback())
  {
   Controller.CreateUpdate("Mr", "Peter", "Morris", "me@home.com", "");
   Assert.IsTrue(ValidationHelper.ErrorExists(Controller.ViewData, NotASausageWebsite.Constants.ErrorMessages.ConfirmationEmailAddressDoesNotMatchEmailAddress));
  }
 }
}


public static class ValidationHelper
{
 public static bool ErrorExists(IDictionary<string, object> viewData, string errorMessage)
 {
  if (!viewData.ContainsKey(NotASausageWebsite.Constants.ViewDataKeys.Global.ErrorMessages))
   return false;
  List<string> errors = (List<string>)viewData[NotASausageWebsite.Constants.ViewDataKeys.Global.ErrorMessages];
  return errors.IndexOf(errorMessage) >= 0;
 }
}



Nice and easy! If you want the testing code from Scott here it is:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Web;
using Rhino.Mocks;
using System.Web.Mvc;
using System.Web.Routing;
using System.Collections.Specialized;

namespace Tests.Helpers
{
 public static class MvcMockHelpers
 {
  public static HttpContextBase FakeHttpContext(this MockRepository mocks)
  {
   HttpContextBase context = mocks.PartialMock<HttpContextBase>();
   HttpRequestBase request = mocks.PartialMock<HttpRequestBase>();
   HttpResponseBase response = mocks.PartialMock<HttpResponseBase>();
   HttpSessionStateBase session = mocks.PartialMock<HttpSessionStateBase>();
   HttpServerUtilityBase server = mocks.PartialMock<HttpServerUtilityBase>();

   SetupResult.For(context.Request).Return(request);
   SetupResult.For(context.Response).Return(response);
   SetupResult.For(context.Session).Return(session);
   SetupResult.For(context.Server).Return(server);

   mocks.Replay(context);
   return context;
  }

  public static HttpContextBase FakeHttpContext(this MockRepository mocks, string url)
  {
   HttpContextBase context = FakeHttpContext(mocks);
   context.Request.SetupRequestUrl(url);
   return context;
  }

  public static void SetFakeControllerContext(this MockRepository mocks, Controller controller)
  {
   var httpContext = mocks.FakeHttpContext();
   ControllerContext context = new ControllerContext(new RequestContext(httpContext, new RouteData()), controller);
   controller.ControllerContext = context;
  }

  static string GetUrlFileName(string url)
  {
   if (url.Contains("?"))
    return url.Substring(0, url.IndexOf("?"));
   else
    return url;
  }

  static NameValueCollection GetQueryStringParameters(string url)
  {
   if (url.Contains("?"))
   {
    NameValueCollection parameters = new NameValueCollection();

    string[] parts = url.Split("?".ToCharArray());
    string[] keys = parts[1].Split("&".ToCharArray());

    foreach (string key in keys)
    {
     string[] part = key.Split("=".ToCharArray());
     parameters.Add(part[0], part[1]);
    }

    return parameters;
   }
   else
   {
    return null;
   }
  }

  public static void SetHttpMethodResult(this HttpRequestBase request, string httpMethod)
  {
   SetupResult.For(request.HttpMethod).Return(httpMethod);
  }

  public static void SetupRequestUrl(this HttpRequestBase request, string url)
  {
   if (url == null)
    throw new ArgumentNullException("url");

   if (!url.StartsWith("~/"))
    throw new ArgumentException("Sorry, we expect a virtual url starting with \"~/\".");

   SetupResult.For(request.QueryString).Return(GetQueryStringParameters(url));
   SetupResult.For(request.AppRelativeCurrentExecutionFilePath).Return(GetUrlFileName(url));
   SetupResult.For(request.PathInfo).Return(string.Empty);
  }
 }
}



 public class FakeViewEngine : IViewEngine
 {
  public void RenderView(ViewContext viewContext)
  {
   ViewContext = viewContext;
  }

  public ViewContext ViewContext { get; private set; }

 }

2008-03-07

ECO, LINQ, Anonymous types, and Web Extensions

I’ve been finding LINQ + Anonymous types really complement ECO and the new ASP web extensions approach to writing websites. I may have mentioned recently that I don’t like the idea of passing instances of my business objects to the presentation layer. The reason is that someone else will be writing the views for this site and I want to be able to control what they are capable of displaying. It’s not just that though, the fact is that your view might need to look completely different to how your business classes are structured, one layer should not dictate the structure of another.

The example I am about to show does in fact have similar structures for the view and model. Having said that there is a slight difference in that the MinorVersion class has its own "int VersionNumber" property, and gets the major part of the version number from self.MajorVersion.VersionNumber. Anyway, now to get on with it.

My requirement was to show all major versions, within each major version show each minor version, and within each minor version show a list of what’s new. In addition, a minor version should only be displayed if its status is #Released, and a major version should not be displayed if it has no minor versions which meet this criterion.


The following code generates a structure like so

MajorVersion (VersionNumber)
 1..* MinorVersion (MajorVersionNumber, MinorVersionNumber)
  1..* WhatsNew (ID, Headline)



and stores the resulting anonymous type into the ViewData for my view to render.

ViewData[GlobalViewDataKeys.WhatsNewKeys.WhatsNewList] = 
 from majorVersion in software.MajorVersions
 where
  (from minorVersionCheck in majorVersion.MinorVersions
   where minorVersionCheck.Status == MinorVersionStatus.Released select minorVersionCheck ).Count() > 0
 select
  new
  {
   VersionNumber = majorVersion.VersionNumber,
    MinorVersions =
     from minorVersion in majorVersion.MinorVersions
     where minorVersion.Status == MinorVersionStatus.Released
     select
     new
     {
      MajorVersionNumber = majorVersion.VersionNumber,
      VersionNumber = minorVersion.VersionNumber,
      WhatsNew =
       from whatsNew in minorVersion.WhatsNew
       select
        new
        {
         ID = whatsNew.ID,
         Headline = whatsNew.Headline
        }
     }
  };
RenderView("AllHistory");




The code behind of my view reads like this:

protected void Page_Load(object sender, EventArgs e)
{
 MajorVersionRepeater.DataSource = ViewData[GlobalViewDataKeys.WhatsNewKeys.WhatsNewList];
 MajorVersionRepeater.DataBind();
}



And finally I use nested ASP:Repeater tags to render the nested HTML.


<ul class="AllHistoryMajorVersionList">
 <asp:Repeater id="MajorVersionRepeater" runat="server">
  <ItemTemplate>
   <li>
    Major version
     <%# DataBinder.Eval(Container.DataItem, "VersionNumber") %>
    <ul class="AllHistoryMinorVersionList">
     <asp:Repeater
      id="MinorVersionRepeater"
       DataSource='<%# DataBinder.Eval(Container.DataItem, "MinorVersions") %>'
      runat="server">
      <ItemTemplate>
       <li>
        Minor version
        <%# DataBinder.Eval(Container.DataItem, "MajorVersionNumber") %>.
        <%# DataBinder.Eval(Container.DataItem, "VersionNumber") %>
        <ul class="AllWhatsNewList">
         <asp:Repeater
          id="WhatsNewRepeater"
           DataSource='<%# DataBinder.Eval(Container.DataItem, "WhatsNew") %>'
          runat="server">
          <ItemTemplate>
           <li>
            <a href="/WhatsNew/View/<%# DataBinder.Eval(Container.DataItem, "ID") %>">
             <%# DataBinder.Eval(Container.DataItem, "Headline") %>
            </a>
           </li>
          </ItemTemplate>
         </asp:Repeater>
        </ul>
       </li>
      </ItemTemplate>
     </asp:Repeater>
    </ul>
   </li>
  </ItemTemplate>
 </asp:Repeater>
</ul>



I think the point is that there is just no need to pass your business class instances through to the UI layer. In fact, if you later changed the structure of your business classes this LINQ would no longer compile, so you would spot the problem immediately; markup in the view is only evaluated at runtime, so an error there wouldn’t show up until you tried to view the page.

2008-02-22

Custom config sections

The website I am writing will sell some software I have already written. In addition it will sell "Collateral", which is basically support files for the software. The software itself will only run if it finds a license, which is an RSA-signed binary file containing information such as the email address of the licensee. In addition, some kinds of collateral will also be RSA-signed with the licensee’s email address so that they will only work for that user, but not all collateral types are signed; for example, a Character is a custom file format and is signed, but a WAV file is not.

So this website needs to sell software + provide a license. It also needs to sell collateral, some of which will require signing and some of which will not.

Software and Collateral are both types of Product, and you can buy a Product. The problem is how should I deal with the 3 different types of licensing (license file, signed binary, no license)? In addition to this should I really create a concrete descendant of the "Software" class just to implement a way of providing a license? Erm, no!

If I were to add a concrete class for every type of product it would mean I have to change the model for the website every time a new product was released, what a pain! From the bottom up, this is what I did.

01: A separate assembly MyWebSite.Licensing with the following two interfaces in it (Edition is linked to Software, so we can buy a license for different editions, Std, Pro, etc):

public interface ISoftwareLicenseGenerator
{
  byte[] Generate(Edition edition, CustomerRole customer);
  string FileExtension { get; }
}

public interface ICollateralLicenseGenerator
{
  byte[] Generate(Collateral collateral, CustomerRole customer);
  string FileExtension { get; }
}


02: Another separate assembly with implementations for the interfaces in them. This assembly is then named something like "MyWebSite.Licensing.SoftwareName".

03: In the web.config file I can then add the following:

<licenseGenerators>
  <software>
    <add uniqueID="SoftwareName" type="MyNameSpace.SoftwareLicenseGeneratorClassName, NameOfAssembly"/>
  </software>
  <collateral>
    <add uniqueID="CollateralTypeName" type="MyNameSpace.CollateralLicenseGeneratorClassName, NameOfAssembly"/>
  </collateral>
</licenseGenerators>


This is parsed when the application is first run, giving me access to a list of generators available which I keep in a Dictionary<string, ......> so that I can look them up by name. If you directly add this kind of section to your web.config your app will not start, so I thought I’d show how to add custom sections + access their values from code....


01: Create a descendant of System.Configuration.ConfigurationSection

public class LicenseGeneratorsSection : ConfigurationSection
{
  public LicenseGeneratorsSection()
  {
  }

  [ConfigurationProperty("software")]
  public SoftwareLicenseGeneratorElementCollection SoftwareLicenseGenerators
  {
    get
    {
      return (SoftwareLicenseGeneratorElementCollection)this["software"];
    }
  }

  [ConfigurationProperty("collateral")]
  public CollateralLicenseGeneratorElementCollection CollateralLicenseGenerators
  {
    get
    {
      return (CollateralLicenseGeneratorElementCollection)this["collateral"];
    }
  }
}

This defines a section with no attributes + 2 element collections, one named "software" and one named "collateral". Next a class is needed for each which descends from System.Configuration.ConfigurationElementCollection so that System.Configuration knows what kind of object to add to these collections when we do <add ......./> within <software> or <collateral>.


02: Element collection, I will list the source only for one...

public class SoftwareLicenseGeneratorElementCollection : ConfigurationElementCollection
{
  //Creates a new instance for the xml <add uniqueID="X" type="Y"/>
  protected override ConfigurationElement CreateNewElement()
  {
    return new SoftwareLicenseGeneratorElement();
  }

  //Gets the key value, in this case the UniqueID property, this avoids duplicates
  protected override object GetElementKey(ConfigurationElement element)
  {
    return ((SoftwareLicenseGeneratorElement)element).UniqueID;
  }
}


03: Finally (for the config settings classes) a class is needed to define the attributes UniqueID and Type.

public class SoftwareLicenseGeneratorElement : ConfigurationElement
{
  public SoftwareLicenseGeneratorElement()
  {
  }

  [ConfigurationProperty("uniqueID", IsRequired = true, IsKey = true)]
  public string UniqueID
  {
    get { return (string)this["uniqueID"]; }
    set { this["uniqueID"] = value; }
  }

  [ConfigurationProperty("type", IsRequired = true)]
  public string Type
  {
    get { return (string)this["type"]; }
    set { this["type"] = value; }
  }
}

04: But before just putting the custom XML into web.config you need to declare this section and associate it with the new ConfigurationSection class

<configSections>
  <section name="licenseGenerators" type="MyWebsite.Configuration.LicenseGeneratorsSection, MyWebsite"/>
</configSections>

This tells ASP .NET to expect a custom section named "licenseGenerators" and to use the LicenseGeneratorsSection class in MyWebsite.dll to determine the structure of it; here it is again:


<licenseGenerators>
  <software>
    <add uniqueID="SoftwareName" type="MyNameSpace.SoftwareLicenseGeneratorClassName, NameOfAssembly"/>
  </software>
  <collateral>
    <add uniqueID="CollateralTypeName" type="MyNameSpace.CollateralLicenseGeneratorClassName, NameOfAssembly"/>
  </collateral>
</licenseGenerators>



To read the config section at runtime is really simple

LicenseGeneratorsSection licenseGeneratorsSection;
licenseGeneratorsSection =
  (LicenseGeneratorsSection)ConfigurationManager.GetSection("licenseGenerators");


Here is the entire class source for the static LicenseGenerator class

public static class LicenseGenerator
{
  static Dictionary<string, ISoftwareLicenseGenerator> SoftwareLicenseGenerators = new Dictionary<string, ISoftwareLicenseGenerator>();
  static Dictionary<string, ICollateralLicenseGenerator> CollateralLicenseGenerators = new Dictionary<string, ICollateralLicenseGenerator>();

  static LicenseGenerator()
  {
    //Read the web.config
    LicenseGeneratorsSection licenseGeneratorsSection;
    licenseGeneratorsSection =
      (LicenseGeneratorsSection)ConfigurationManager.GetSection("licenseGenerators");

    foreach (SoftwareLicenseGeneratorElement currentElement in licenseGeneratorsSection.SoftwareLicenseGenerators)
      SoftwareLicenseGenerators[currentElement.UniqueID] = (ISoftwareLicenseGenerator)GetInstance(currentElement.Type);

    foreach (CollateralLicenseGeneratorElement currentElement in licenseGeneratorsSection.CollateralLicenseGenerators)
      CollateralLicenseGenerators[currentElement.UniqueID] = (ICollateralLicenseGenerator)GetInstance(currentElement.Type);
  }

  private static object GetInstance(string typeName)
  {
    Type objectType = Type.GetType(typeName);
    return Activator.CreateInstance(objectType);
  }

  public static ICollateralLicenseGenerator GetCollateralLicenseGenerator(string uniqueID)
  {
    return CollateralLicenseGenerators[uniqueID];
  }

  public static string[] GetCollateralLicenseGeneratorIDs()
  {
    List<string> result = new List<string>();
    foreach (KeyValuePair<string, ICollateralLicenseGenerator> kvp in CollateralLicenseGenerators)
      result.Add(kvp.Key);
    result.Sort();
    return result.ToArray();
  }

  public static ISoftwareLicenseGenerator GetSoftwareLicenseGenerator(string uniqueID)
  {
    return SoftwareLicenseGenerators[uniqueID];
  }

  public static string[] GetSoftwareLicenseGeneratorIDs()
  {
    List<string> result = new List<string>();
    foreach (KeyValuePair<string, ISoftwareLicenseGenerator> kvp in SoftwareLicenseGenerators)
      result.Add(kvp.Key);
    result.Sort();
    return result.ToArray();
  }
}



This solution allows me to add new products at runtime without any problems. Any time a new type of license generator is required I just create a new assembly with a class that implements the correct interface and then register it in the web.config file!
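
As an illustration, a new generator assembly only needs a class along these lines (hypothetical names; the actual RSA signing code is obviously product specific):

namespace MyWebSite.Licensing.SomeProduct
{
  public class SomeProductLicenseGenerator : ISoftwareLicenseGenerator
  {
    public byte[] Generate(Edition edition, CustomerRole customer)
    {
      //Build and RSA-sign the license data for this edition/customer here
      return new byte[0]; //Placeholder
    }

    public string FileExtension
    {
      get { return ".license"; } //Hypothetical extension
    }
  }
}

It is then registered with an <add uniqueID="SomeProduct" type="MyWebSite.Licensing.SomeProduct.SomeProductLicenseGenerator, MyWebSite.Licensing.SomeProduct"/> entry in the <software> element shown above.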

All users

Yesterday I needed my app to read and write data from a folder to which all users have access. Having the data in the current user's data folder was unacceptable as this would have resulted in duplicate data stores; the MSI installer even generates a compiler warning telling me I shouldn’t use this folder! So I went for Environment.GetFolderPath(SpecialFolder.CommonApplicationData);

This seemed to work fine until I tested on Vista, at which point my app would "stop responding" and quit. With a bit of investigation I discovered that CommonApplicationData maps to c:\ProgramData on Vista, which to me looked good until I tried creating a read/write FileStream in that path and received an access denied exception. So, where was I supposed to store my data? Checking each of the values in the SpecialFolder enum I was surprised to see that there doesn't seem to be a suitable value.

So, I reflected over Environment.GetFolderPath and copied the code. I then started at 0 and worked up until I hit a path with the word "public" in it (running on Vista). The value I required was 24! I looked through the SpecialFolder enum and there was no entry for 24; what's worse is that GetFolderPath won't accept arbitrary integer values. So, I had to reimplement it myself.

public static class SpecialFolders
{
  public static string GetFolderPath(SpecialFolderEx folder)
  {
    if (!Enum.IsDefined(typeof(SpecialFolderEx), folder))
      throw new ArgumentException("Unknown folder: " + folder.ToString());

    StringBuilder lpszPath = new StringBuilder(260);
    SHGetFolderPath(IntPtr.Zero, (int)folder, IntPtr.Zero, 0, lpszPath);
    string path = lpszPath.ToString();
    new FileIOPermission(FileIOPermissionAccess.PathDiscovery, path).Demand();
    return path;
  }

  [DllImport("shfolder.dll", CharSet = CharSet.Auto)]
  private static extern int SHGetFolderPath(IntPtr hwndOwner, int nFolder, IntPtr hToken, int dwFlags, StringBuilder lpszPath);
}

public enum SpecialFolderEx
{
  ApplicationData = 0x1a,
  CommonApplicationData = 0x23,
  CommonDocuments = 0x2e,
  CommonProgramFiles = 0x2b,
  Cookies = 0x21,
  Desktop = 0,
  DesktopDirectory = 0x10,
  Favorites = 6,
  History = 0x22,
  InternetCache = 0x20,
  LocalApplicationData = 0x1c,
  MyComputer = 0x11,
  MyDocuments = 5,
  MyMusic = 13,
  MyPictures = 0x27,
  Personal = 5,
  ProgramFiles = 0x26,
  Programs = 2,
  Recent = 8,
  SendTo = 9,
  StartMenu = 11,
  Startup = 7,
  System = 0x25,
  Templates = 0x15,
  Windows = 0x24
}


If in future like me you need to store data for all users to access you should use SpecialFolderEx.CommonDocuments, because now it works a treat!
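
For example (a minimal sketch; the file name is hypothetical):

string commonDocuments = SpecialFolders.GetFolderPath(SpecialFolderEx.CommonDocuments);
string dataFileName = Path.Combine(commonDocuments, "MyApp.Data.gdb"); //Hypothetical file name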

2008-02-20

Embedded Firebird, error trying to write to file

This error has been really annoying me tonight!

I have an app that uses Embedded Firebird for its DB so that I don't need to install a DB server. On Vista my app throws an exception "Error trying to write to file (the correct path here)".

I recreated the DB on my development machine (XP) and tried running it, it should work, it has for months, but it didn't! The same error too!

For the life of me I couldn't work out why it would suddenly stop working on both machines, what did they have in common? I uninstalled stuff, reinstalled it, etc, no joy.

The answer on my XP box was simple. I used the local server to create the GDB file + generate my DB structure using ECO. What I hadn't thought of was the fact that the firebird server then holds a file handle open on that GDB file in case I want to use it again. Embedded firebird needs an exclusive lock on the file so this was the problem on my XP box. I wish the error had read something like "Error trying to write to file, unable to obtain an exclusive lock", would have saved me some time!

However, I don't have the firebird server installed on my Vista test machine so what was causing the problem there? It seems that embedded firebird cannot access the GDB on vista if it is in the CommonApplicationData folder. This is a real pain because

A: I need the database in a common place so that any user using the software will see the same data.
B: This is where it is supposed to go!

So doesn't the current user have sufficient privileges to write to this folder? The following snippet of test code says they do.

static void Main(string[] args)
{
  string fileName = Environment.GetFolderPath(System.Environment.SpecialFolder.CommonApplicationData);
  fileName = Path.Combine(fileName, "MyTest.txt");
  Console.WriteLine("Writing to " + fileName);
  StreamWriter sw = new StreamWriter(fileName);
  using (sw)
    sw.WriteLine("Hello");
  Console.WriteLine("Done");
  Console.ReadLine();
}



I have checked and the database file is in CommonApplicationData on the Vista machine, so it's not as though I am installing into the wrong folder or something either.

The only thing I can think of is that the UAC rules are different between .NET assemblies and native DLLs. It's the only thing I can think of, I really could do with a solution to this!



Pete

2008-02-17

Test driven ECO

Here are my latest revelations :-)

01
Instead of having to mock IEcoServiceProvider and IOclPsService in order to avoid DB access simply use PersistenceMapperMemory. This way I can create the objects I want, UpdateDatabase, and then run my tests. It’s much easier to read, and more importantly less typing.

02
My page controllers no longer use an EcoSpace. Instead the code always uses a ServiceProvider property of type IEcoServiceProvider. When I want to test my controller I create an instance and set its ServiceProvider property. Now whenever the controller needs to do anything it will go through the ServiceProvider I specified.

This is beneficial for a number of reasons. Firstly it means that I can create an EcoSpace in my test and set its PersistenceMapper to PersistenceMapperMemory before activating it. Secondly I can also opt to pass a mocked IEcoServiceProvider which either returns the real service requested or returns a mocked one. An example of this is that I validate my page by using a registered IConstraintProvider interface (defined in DroopyEyes.Eco.Validation). I can check that a controller action won't save a modified object if it is invalid. Instead of having to know how to make the object invalid I just mock the IConstraintProvider and have it always return a single constraint with the expression "false" so that it always fails. In addition, because I know the name of the constraint that is broken, I can then check ViewData["Errors"] and ensure that the controller action has displayed the error messages.
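
The first of these boils down to something like this in a test's SetUp (a minimal sketch, assuming the ServiceProvider property described above):

MyWebsiteEcoSpace ecoSpace = new MyWebsiteEcoSpace();
ecoSpace.PersistenceMapper = new PersistenceMapperMemory(); //No database access
ecoSpace.Active = true;

AccountController controller = new AccountController();
controller.ServiceProvider = ecoSpace; //Or a mocked IEcoServiceProvider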

Sure I can just write the action in a minute and know it works, but having these test cases ensures that if someone else modifies my project’s source code without fully understanding what they are doing I will know what they broke. Or, they will know what they broke and can fix it themself!

So there you are. Same end result, less code.

2008-02-14

ECO, should we mock it?

I watched a video on Rhino Mocks yesterday. What a great framework! Obviously I wanted to know if I could use this with ECO so I thought I'd give it a try.

In my website's AccountController there is a method like so

public void AttemptLogin(string emailAddress, string password, string redirectUrl)
{
}


Now I could just go ahead and write some OCL to find the user, but instead of doing this I really want to separate the code a bit. So I created a class

public class UserRepository
{
  private readonly IEcoServiceProvider ServiceProvider;

  public UserRepository(IEcoServiceProvider serviceProvider)
  {
    ServiceProvider = serviceProvider;
  }

  public User GetByEmailAddressAndPassword(string emailAddress, string password)
  {
    string searchEmail = BusinessClassesHelper.EscapeOcl(emailAddress);
    string criteria = string.Format("->select(emailAddress.sqlLikeCaseInsensitive('{0}'))", searchEmail);
    return BusinessClassesHelper.SelectFirstObject<User>(ServiceProvider, "User", criteria);
  }
}


Now I can get my user like so....

public void AttemptLogin(string emailAddress, string password, string redirectUrl)
{
  MyWebsiteEcoSpace ecoSpace = new MyWebsiteEcoSpace();
  ecoSpace.Active = true;
  try
  {
    UserRepository repository = new UserRepository(ecoSpace);
    MyWebsite.Model.User user = repository.GetByEmailAddressAndPassword(emailAddress, password);
  }
  finally
  {
    ecoSpace.Active = false;
  }
}


So what's the benefit? The important thing to note is that I pass in an instance of IEcoServiceProvider to the UserRepository object. So if I want to test the UserRepository class on its own I can pass a dummy object for the serviceProvider. This means that I don't have to access the DB which would slow things down (especially if I have to keep clearing the DB down), in fact I don't even need to connect to the DB at all!

If you remember UserRepository.GetByEmailAddressAndPassword() looks like this
public User GetByEmailAddressAndPassword(string emailAddress, string password)
{
  string searchEmail = BusinessClassesHelper.EscapeOcl(emailAddress);
  string criteria = string.Format("->select(emailAddress.sqlLikeCaseInsensitive('{0}'))", searchEmail);
  return BusinessClassesHelper.SelectFirstObject<User>(ServiceProvider, "User", criteria);
}


and BusinessClassesHelper uses the IOclPsService and IOclService in combination to get to the result. Surely this is all too complicated to mock? Not with Rhino, no!

[TestFixture]
public class UserRepositoryTests
{
  MockRepository Mocks;
  IEcoServiceProvider MockServiceProvider;
  IOclPsService MockOclPsService;
  MyWebsiteEcoSpace EcoSpace;

  [SetUp]
  public void SetUp()
  {
    //Create a mock repository
    Mocks = new MockRepository();

    //Create the mock IEcoServiceProvider
    MockServiceProvider = Mocks.CreateMock<IEcoServiceProvider>();

    //I also need a mock IOclPsService to avoid DB access
    MockOclPsService = Mocks.CreateMock<IOclPsService>();

    //Create a transient version of my EcoSpace
    EcoSpace = new MyWebsiteEcoSpace ();
    EcoSpace.PersistenceMapper = null; //No persistence!
    EcoSpace.Active = true;
  }

  [TearDown]
  public void TearDown()
  {
    EcoSpace.Active = false;
    Mocks.ReplayAll(); //Just in case we forgot, calling twice has no effect!
    Mocks.VerifyAll(); //Ensure everything expected was called
  }

  [Test]
  public void GetUserByEmailAddressAndPassword()
  {
    //Create a list of users to return from the mock IOclPsService
    IObjectList userList = EcoSpace.VariableFactory.CreateTypedObjectList(typeof(User), false);
    
    //Add a single user to that list
    User expectedUser = new User(EcoSpace);
    expectedUser.EmailAddress = "me@home.com";
    expectedUser.SetPassword("1234567890");
    userList.Add(expectedUser.AsIObject());

    //Start specifying what we expect to be called, and what we should do as a result
    Mocks.Record();

    //When GetEcoService<IOclPsService> is called return our MockOclPsService
    Expect.Call(MockServiceProvider.GetEcoService<IOclPsService>()).Return(MockOclPsService);
  
    //Same for GetEcoService(typeof(IOclPsService))
    Expect.Call(MockServiceProvider.GetEcoService(typeof(IOclPsService))).Return(MockOclPsService);

    //When asked for the IOclService (not PS service) return the real one
    Expect.Call(MockServiceProvider.GetEcoService<IOclService>()).Return(EcoSpace.Ocl);
    Expect.Call(MockServiceProvider.GetEcoService(typeof(IOclService))).Return(EcoSpace.Ocl);

    //When MockOclPsService.Execute is executed return our userList
    Expect.Call(MockOclPsService.Execute(null)).Return(userList);
    //This means we don't care what the exact parameter is, any OCL will do
    LastCall.IgnoreArguments();

    //Now go into play back mode
    Mocks.ReplayAll();

    //Create the UserRepository using our mock services
    UserRepository repository = new UserRepository(MockServiceProvider);

    //Ask for the user
    User foundUser = repository.GetByEmailAddressAndPassword(expectedUser.EmailAddress, "1234567890");

    //Ensure that we got the same user back
    Assert.AreEqual(expectedUser, foundUser, "Found the wrong user");
  }
}



Nice eh :-)

2008-02-13

Unit testing MonoRail controllers

I spent yesterday finishing off (mostly) my business model, then the end of yesterday + today writing test cases for those classes. Everything was going great, I found at least 3 errors in my code that I hadn’t realised were there and also realised there were a few more things I needed.

Then it was time to start testing the controllers in my MonoRail site. What a disaster!

Attempt 1:
[Test]
public void AdminOnly_Home()
{
  AdminController controller = new AdminController();
  controller.Home();
  Assert.IsTrue(controller.Response.WasRedirected, "Should have been redirected");
}


The problem with this was pretty obvious, Controller doesn’t have a Response etc set up. So along came attempt 2:

[Test]
public void AdminOnly_Home()
{
  AdminController controller = new AdminController();
  PrepareController(controller);
  controller.Home();
  Assert.IsTrue(controller.Response.WasRedirected, "Should have been redirected");
}


Now the controller is set up with mock objects and will run! Unfortunately the BeforeAction filter on my action was not being executed. Aha! Pretty obvious problem! If I call the method directly how can the framework possibly find all of the reflection attributes and process them etc? *slaps head*

Attempt 3
[Test]
public void AdminOnly_Home()
{
  AdminController controller = new AdminController();
  PrepareController(controller, "Admin", "Home");
  controller.Process(controller.Context, controller.ControllerContext);
  Assert.IsTrue(controller.Response.WasRedirected, "Should have been redirected");
}


Still no joy! The filters just aren’t being executed. Someone on the user groups said that this is expected behaviour and that the filter should be tested in isolation. Whereas I agree for the most part unfortunately it doesn’t apply in this case. My ECO extensions to MonoRail allow the developer to specify pooling, session, default EcoSpace types, and so on. If these reflection attributes aren’t processed then the action just isn’t going to act in the same way it will at runtime!

At the moment I am sorely disappointed! I was really looking forward to writing a test driven website but unless this guy was wrong it doesn’t look like it is going to be possible!

It’s at times like these I wonder how difficult it really is to write your own MVC framework? Maybe I will take another look at the MS offering. If I had enough free time I'd make my own :-)

2008-02-09

Validation

I have a model like so

Product 1----* ProductVersion
ProductVersion 1----* ProductEdition

ProductVersion can be in one of two states: UnderDevelopment / Released

ProductEdition has a DownloadUrl:string attribute which is only required if self.version.status = #Released


The validation for ProductEdition works perfectly, I cannot leave the DownloadUrl blank if the ProductVersion has already been released. Unfortunately when I already have a number of ProductEdition instances with no DownloadUrl and then make my ProductVersion live, the editions are not validated because they are not dirty. So I needed some way to ensure that when ProductVersion is validated all related ProductEdition instances are also validated.

Step 01: Add a way to allow ProductVersion to identify other objects to be validated.

In the business classes project I added the following interface.

public interface IValidationExtender
{
  IEnumerable GetConstraintedObjects();
}


My ProductVersion can do this

IEnumerable IValidationExtender.GetConstraintedObjects()
{
  List<IObject> result = new List<IObject>();
  foreach (ProductEdition currentEdition in Editions)
    result.Add(currentEdition.AsIObject());
  return result;
}



Step 02: Create a validation service which validates all objects : Only implemented methods are shown

public class ExtendedConstraintProvider : IConstraintProvider
{
  private IConstraintProvider ModeledConstraintProvider;

  public void GetConstraintsForObject(IObject instance, List<IConstraint> constraints)
  {
    if (instance == null)
      throw new ArgumentNullException("instance");

    //Delegate to GetConstraintsForObjects
    GetConstraintsForObjects((IObjectList)instance.GetAsCollection(), constraints);
  }

  public void GetConstraintsForObjects(IObjectList objectList, List<IConstraint> constraints)
  {
    if (objectList == null)
      throw new ArgumentNullException("objectList");
    if (objectList.Count == 0)
      return;

    //Get all constrained objects
    Dictionary<IObject, object> includedObjects = new Dictionary<IObject, object>();
    foreach (IObject currentObject in objectList)
      RecursiveGetExtendedObjects(currentObject, includedObjects);

    //Add the objects to a list
    IObjectList newInstances = EcoServiceHelper.GetVariableFactoryService(objectList[0]).CreateUntypedObjectList(true);
    foreach (KeyValuePair<IObject, object> kvp in includedObjects)
      newInstances.Add(kvp.Key);

    //Return the constraints from ModeledConstraintProvider
    ModeledConstraintProvider.GetConstraintsForObjects(newInstances, constraints);
  }

  private void RecursiveGetExtendedObjects(IObject currentObject, Dictionary<IObject, object> includedObjects)
  {
    //Don't process the same object twice
    if (includedObjects.ContainsKey(currentObject))
      return;

    includedObjects.Add(currentObject, null);

    //If the class implements IValidationExtender then add its constrained objects
    IValidationExtender extender = currentObject.AsObject as IValidationExtender;
    if (extender != null)
    {
      foreach (IObject dependentObject in extender.GetConstraintedObjects())
        RecursiveGetExtendedObjects(dependentObject, includedObjects);
    }
  }
}




Step 03: Register the service in the EcoSpace

public InteevoWebsiteEcoSpace(): base()
{
  InitializeComponent();
  RegisterEcoService(typeof(IConstraintProvider), new ExtendedConstraintProvider());
}



Now I can validate a list of dirty objects using EcoSpace.GetEcoService<IConstraintProvider>().GetConstraintsForObjects.


Step 04: Last point of defence, ensure that no invalid objects may be saved. Only relevant methods are shown.

internal class ValidatingPersistenceService : IPersistenceService
{
  private IPersistenceService Inner;
  private IEcoServiceProvider ServiceProvider;

  internal ValidatingPersistenceService(IEcoServiceProvider serviceProvider)
  {
    if (serviceProvider == null)
      throw new ArgumentNullException("serviceProvider");

    ServiceProvider = serviceProvider;
    Inner = ServiceProvider.GetEcoService<IPersistenceService>();
    if (Inner == null)
      throw new ArgumentException("ServiceProvider did not provide an instance for IPersistenceService");
  }

  private IConstraintProvider constraintProvider;
  private IConstraintProvider ConstraintProvider
  {
    get
    {
      if (constraintProvider == null)
      {
        constraintProvider = ServiceProvider.GetEcoService<IConstraintProvider>();
        if (constraintProvider == null)
          throw new InvalidOperationException("IConstraintProvider not registered as an ECO service");
      }
      return constraintProvider;
    }
  }



  void IPersistenceService.UpdateDatabaseWithList(IObjectList list)
  {
    ValidateObjects(list);
    Inner.UpdateDatabaseWithList(list);
  }

  private void ValidateObjects(IObjectList objects)
  {
    List<DroopyEyes.EcoExtensions.Validation.IConstraint> constraints = new List<DroopyEyes.EcoExtensions.Validation.IConstraint>();
    ConstraintProvider.GetConstraintsForObjects(objects, constraints);
    foreach (DroopyEyes.EcoExtensions.Validation.IConstraint currentConstraint in constraints)
    {
      if (!currentConstraint.IsValid)
      {
        throw new InvalidOperationException(
          string.Format("Cannot update database with invalid objects:\r\n{0} : {1}",
            currentConstraint.Instance.UmlClass.Name, currentConstraint.Name)
        );
      }
    }//foreach constraint
  }
}



Step 05: Replace the standard IPersistenceService in the EcoSpace

public InteevoWebsiteEcoSpace(): base()
{
  InitializeComponent();
  RegisterEcoService(typeof(IPersistenceService), new ValidatingPersistenceService(this));
  RegisterEcoService(typeof(IConstraintProvider), new ExtendedConstraintProvider());
}



Finally I have an implementation which does the following

A: Allows me to get constraints for dirty objects + all relevant objects
B: Prevents the app from saving objects with broken constraints.

2008-02-07

EcoRail

The whole idea of having a controller and a view is so that the view renders only exactly what it is given, and the controller is able to give it whatever data it likes from wherever it needs to obtain it.

After working with ECO and Monorail for a while it has been a real pleasure, but I am starting to think that maybe exposing ECO objects directly to the view is not the right approach.

If for example I put an Employee into the PropertyBag the view can easily display $Employee.Salary. This might not be a problem when you develop both the controllers and the view but in my case someone else will ultimately create the views. Do I really want them to be able to have access to this information? In addition, what if the view engine they use has a scripting language that is able to set values? Setting $Employee will merely set the PropertyBag["Employee"] value, but setting $Employee.Salary could see a certain view developer buying a new car next month.

I am very tempted to change the site whilst it is in its early stages of development. It does seem more logical to have small chunks of data or small classes to pass back and forth between the controller and the view. This is more in line with the design I have in my PocketPC application.
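
For example, instead of putting the Employee itself into the PropertyBag I could pass something like this (a hypothetical view-specific class):

public class EmployeeSummary
{
  public string FullName { get; set; }
  public string Department { get; set; }
  //Deliberately no Salary property, so the view cannot display it, let alone set it
}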

If that is the case it will probably mean that EcoRails is redundant! Actually, only the EcoDataBind part would really be redundant I think, the rest would still be quite useful!

Your bug is my bug

I recently released an update to some software and a bug slipped through the net. It introduced some odd behaviour with a control named SmartGrid. After some testing I was able to determine that it wasn't my fault and that I could reproduce a bug in SmartGrid. I hate bugs in other people's source code, I can't fix it, I am at their complete mercy.

Thankfully the Resco support was amazing! I posted on their forums and immediately someone sent me instructions on where to send my project. The next morning I was disappointed to see an email saying that the project worked fine. I posted again and almost immediately someone had offered to chat on skype.

We did that for a while, both confused by the problem. We then went on to use Remote Assistance so that he could observe my bug which he wasn't experiencing.

In the end the problem was very confusing. I had Version A of the DLL in which the error occurred. I upgraded to the latest version (B) and it still occurred. The guy at Resco sent me a DLL with debug strings being sent to the IDE (C.DEBUG) and everything worked. I reverted to Version B and now it worked whereas before it didn't. Don't you just hate phantom bugs that "fix" themselves?

In the end the Resco guy sent me a full build of the very latest code base (Version C) and all seems fine.

Both of us were at a complete loss, but thanks to their excellent support the problem with my application was kept to as short a time as possible!

2008-02-05

Converting a recurring decimal to a fraction

For a couple of hours a week I am doing a beginner's level maths course. The topic today was converting recurring decimals into fractions. For example (the [] marks the repeating digits):


0.[6]

 1x = 0.[6]
10x = 6.[6]

9x = 10x - 1x
9x = 6.[6] - 0.[6] = 6


Therefore the answer is 6/9, or 2/3

When it got to numbers like 0.12[34] (i.e. 0.12343434343434.....) the lesson was really complicated, but I came up with a simpler approach, so here it is if ever you need it.

My first observation was that we need a large number and a small number. So let's start with the 10x we used above


 1x = 0.12[34]
10x = 1.2[34]


therefore 9x = 10x - 1x which is


 1.2[34]
-0.12[34]


This is going to give us 1.1(something)

My second observation is that fractions cannot have decimal values in them. In order to get rid of the decimal part we need the big number to have exactly the same decimal part as the small number, e.g.


 B.[34]
-S.[34]
=X.[00]


So our small number needs to end with [34]. Given the number 0.12[34] how many places do we need to shift the decimal to the right? The answer is 2, so we need the number 1 and two zeros, which is 100

Small number = 100x

To do a subtraction and get a positive number our big number needs to be bigger than 100x


  1x = 0.12[34]
100x = 12.[34]


How many digits are recurring? The answer is 2. This means that the larger factor has two more zeros than the lower factor in order to ensure both numbers end with [34].


    1x = 0.12[34]
  100x = 12.[34]
10000x = 1234.[34]

1234.[34]
- 12.[34]
=========
1222.[00]


Now we have a whole number!

1222 / (10,000 - 100) is 1222/9900


So the quick way of doing it is this


x = 0.12[34]
    0.AA[BB]


Small = x * 10 to the power of the number of non-recurring decimal digits [A]
Big = x * 10 to the power of the total number of decimal digits [A + B]

Small = x * 10^2 = 100x
Big = x * 10^4 = 10,000x

Big - small = 1222
Big factor (10000) - small factor (100) = 9900

Therefore the answer is 1222/9900


Or like this


1x = 0.12[34]

How many non recurring digits? Two, so our small factor is a 1 with 2 zeros = 100
How many recurring digits? Two, so our large factor has 2 more zeros than the small one = 1 00 00 = 10,000
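
If I ever needed this in code, the quick method boils down to something like this (a sketch; nonRecurring and recurring are the digit strings A and B):

//0.12[34] => nonRecurring = "12", recurring = "34"
static string ToFraction(string nonRecurring, string recurring)
{
  long smallFactor = (long)Math.Pow(10, nonRecurring.Length);                  //100
  long bigFactor = (long)Math.Pow(10, nonRecurring.Length + recurring.Length); //10,000
  long small = nonRecurring.Length == 0 ? 0 : long.Parse(nonRecurring);        //12
  long big = long.Parse(nonRecurring + recurring);                             //1234

  //bigFactor*x - smallFactor*x is the whole number big - small
  return (big - small) + "/" + (bigFactor - smallFactor);                      //"1222/9900"
}

ToFraction("12", "34") gives "1222/9900" and ToFraction("", "6") gives "6/9".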

MaxLength

Implementing HTML maxlength was a bit of a pain. Not to write the helpers though, that was easy....

$EcoModelHelper.AttributeLength($Product, "ID")


But when it came to specifying that in the <input> it was too much work! This is how it is done statically...

$FormHelper.TextFieldValue("Product.ID", $Product.ID, "%{maxlength='32'}")


Now I had to replace the static 32 with the EcoModelHelper code.

#set ($ProductIDLength = $EcoModelHelper.AttributeLength($Product, "ID"))
$FormHelper.TextFieldValue("Product.ID", $Product.ID, "%{maxlength='$ProductIDLength'}")


This was starting to look like too much typing!

So instead I have decided to add new methods to the EcoFormHelper. Here is the first:

$EcoFormHelper.ObjectTextField("Product.ID", $Product, "ID")


This will output something like this

<input type="text" id="Product_ID" name="Product.ID" value="AlterEgo" maxlength="32" />

It just uses the normal MonoRail $FormHelper.TextFieldValue helper but passes it the current value of the object and the maximum length as defined in the model.

More work up front, less in the long run :-)

EcoRail validation

Here is yesterday's update.

I wanted a way to validate the user input. Seeing as there are constraints in the model to me this was the obvious approach to take. The HTML in my main layout (MasterPage) was changed like so

<body>
  #if ($Errors && $Errors.Count > 0)
    <ul class="errors">
      #foreach ($currentError in $Errors)
        <li>$currentError</li>
      #end
    </ul>
  #end

  $childContent

</body>


This outputs all errors passed in PropertyBag["Errors"] or in my case I used Flash["Errors"].


To validate my product input I changed my controller like so:

[AllowEcoSpaceDeactivateDirty(true)]
public void Modify([EcoDataBind("Product", Allow = "ID,Name", NoObjectIdAction = ObjectIdAction.CreateNewInstance)]Product product)
{
  PropertyBag["Product"] = product;
  IList<string> errors = GetErrorsForAllDirtyObjects();
  if (errors.Count > 0)
    Flash["Errors"] = errors;
  else
  {
    EcoSpace.UpdateDatabase();
    RedirectToAction("List");
  }
}


GetErrorsForAllDirtyObjects uses the DefaultEcoSpaceType to find the EcoSpace instance and then checks all constraints of all dirty objects in order to return a list of strings. Available validation routines are


protected IList<string> GetErrorsForObject(IObjectProvider instance)

Gets error messages for broken constraints on a single object


protected IList<string> GetErrorsForAllDirtyObjects(Type ecoSpaceType)

Gets the EcoSpace instance of the type specified and then returns error messages for broken constraints on all modified objects


protected IList<string> GetErrorsForAllDirtyObjects()

Calls GetErrorsForAllDirtyObjects(Type ecoSpaceType) using the DefaultEcoSpaceType specified



Now I have to take into account that not everyone wants their error messages to come from OCL constraints defined in the model. To cater for this my validation routines do not read the model directly; instead they use a virtual property

private IConstraintProvider m_ConstraintProvider;
protected virtual IConstraintProvider ConstraintProvider
{
  get
  {
    if (m_ConstraintProvider == null)
      m_ConstraintProvider = new ModeledConstraintProvider();
    return m_ConstraintProvider;
  }
}


The default implementation returns an instance of ModeledConstraintProvider which is a class in the DroopyEyes.Eco.Extensions project, but you can now override this property on your controller and return any implementation you like.
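
For example, a controller that wants its error messages to come from somewhere other than the modelled OCL constraints could override the property like this. MyCustomConstraintProvider is just a placeholder for whatever IConstraintProvider implementation you supply.

public class ProductController : BaseController
{
  // Swap in a different constraint provider; the validation routines above
  // will use this instead of ModeledConstraintProvider.
  protected override IConstraintProvider ConstraintProvider
  {
    get { return new MyCustomConstraintProvider(); }
  }
}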

So now I have OCL validation from the model. Next I think I will add an EcoModelHelper so that you can obtain information from the model. To start with I think all I will implement is something like the following

$EcoModelHelper.Length("Person", "FirstName")

2008-02-03

Changing the URL structure

I wanted the following URL structure in my website

www.mysite.com/product/myproductname/whatsnew

but the default url mapping in MonoRail would translate this as

www.mysite.com/[controller]/[action]/[id]

So it would expect to find this

public class ProductController
{
  public void MyProductName(string id)
  {
  }
}

whereas what I actually want is

public class ProductController
{
  public void WhatsNew(string productName)
  {
  }
}



  1. Open Web.Config

  2. Locate the monorail node

  3. Locate the routing child node

  4. Now add a new <rule> to the top of the list:


<rule>
  <pattern>/(product)/(\w+)/(\w+)</pattern>
  <replace><![CDATA[ /product/$3.rails?productName=$2]]></replace>
</rule>


As this is at the top of the list it will have the highest priority. If the URL matches /product it will remap the URL

From:
www.mysite.com/product/myproductname/whatsnew

To:
www.mysite.com/product/whatsnew.rails?productName=myproductname


without the user ever seeing it :-)

2008-02-02

ECO docs progress

I'm currently in the process of migrating the QuickStart series from BDS over to VS and recreating the accompanying source code.

There's quite a lot of information in those articles; I hadn't realised how much I had written! The transcription is going quite well. So far I have reached article number 5, which is going a bit slower because it is also a translation from VCL .NET to WinForms.

2008-02-01

Inversion of control

As you may already know I am writing a website. I've chosen to use MonoRail for the web part and ECO for the persistence. Today has been great fun! I have modified the Castle.MonoRail.EcoSupport library with the following enhancements.

  1. You can now specify a [DefaultEcoSpaceType(typeof(MyEcoSpace))] on either the class or method.
  2. On the EcoDataBind reflection attribute you can now specify as little as [EcoDataBind("Product")] on your method parameter; this will use the DefaultEcoSpaceType specified, or throw an exception if no default was specified.

This lets me write code like this:

[AllowEcoSpaceDeactivateDirty(true)]
[UseEcoSpacePool(false)]
[UseEcoSpaceSession(EcoSpaceStrategyHandler.SessionStateMode.Never)]
public class ProductAdminController : BaseController
{
  public void Create()
  {
    PropertyBag["Product"] = new Product(GetEcoSpace<MyWebSiteEcoSpace>());
    RenderView("Modify");
  }
}

I can now easily create actions Create/Modify/Edit which all render the same view "Modify.vm". The method signatures for Edit and Modify are as follows:

public void Modify(string id)
public void Modify([EcoDataBind("Product", Allow="Name", NoObjectIdAction=ObjectIdAction.CreateNewInstance)]Product product)

The EcoDataBind attribute additionally says that only "Name" should be applied to the object; this is to stop people from posting custom requests with Product.IsActive set to "false", for example, so everything except Name will be ignored. It also states that if there is no Product.ExternalId in the form (there won't be for new objects) then a new Product instance should be created.

But now the fun part. Instead of writing code like this to get a product by its name:

IEnumerable<Product> products =
GetDefaultEcoSpace().Ocl.Evaluate("Product.allInstances->orderBy(name)").GetAsIList<Product>();
PropertyBag["Products"] = products;

Maybe it would be better to have a data-access-layer so that I don't have to hard-code that OCL whenever I want a list of products? Sounds good....

public interface IProductProvider
{
  IEnumerable<Product> GetAll();
  Product GetSingleByName(string name);
  IEcoServiceProvider ServiceProvider { get; set; }
}

Now the code is written as follows:

IProductProvider productProvider = new MyImplementationProductProvider();
productProvider.ServiceProvider = GetDefaultEcoSpace();
PropertyBag["Product"] = productProvider.GetSingleByName(id);

Okay, so now I have a generic way of getting a list of products ordered by name and a single product by its name. What next? Well, seeing as I am writing "proper" code these days I thought I'd add some unit testing in there. After having Shamresh from Inspiration Matters talk so enthusiastically about Inversion of Control with me on Skype over the last month or so, I thought I'd take a look at that.

Here's how it works. First I put some XML into my web.config
<component id="IProductProvider" 
  service="MyWebsite.Services.IProductProvider, MyWebsite"
  type="MyWebsite.Services.Implementation.ProductProvider, MyWebSite"/>


What this does is register a type for a service. It says that the class ProductProvider should be used whenever I ask for IProductProvider. Here is how my method implementation changes...

public void Create()
{
  PropertyBag["Product"] = new Product(GetEcoSpace<MyWebSiteEcospace>());
  RenderView("Modify");
}

public void List()
{
  IProductProvider productProvider = WindsorContainer.Resolve<IProductProvider>();
  productProvider.ServiceProvider = GetDefaultEcoSpace();
  PropertyBag["Products"] = productProvider.GetAll();
}

public void Modify(string id)
{
  IProductProvider productProvider = WindsorContainer.Resolve<IProductProvider>();
  productProvider.ServiceProvider = GetDefaultEcoSpace();
  PropertyBag["Product"] = productProvider.GetSingleByName(id);
}

public void Modify([EcoDataBind("Product", Allow="Name", NoObjectIdAction=ObjectIdAction.CreateNewInstance)]Product product)
{
  GetEcoSpace().UpdateDatabase();
  RedirectToAction("List");
}

As you can see I now use WindsorContainer (from www.castleproject.org) to retrieve my IProductProvider instead of creating it directly. You might be wondering what the point of doing that is. The point is that later, when it comes to running my unit tests, I can configure the WindsorContainer to create a different class to supply IProductProvider. This means that I can have a class specific to my test that returns 3 products that I create on the spot with specific names, then I can inspect the output of the view and check that they are all being rendered.

This is just one example. The main idea is that my unit tests can control what data the controller actions are given so that I can ensure a known data state during the tests instead of having to rely on having certain objects in my database.
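
As a rough sketch of how that looks in a test (FakeProductProvider is a hypothetical test double implementing IProductProvider; it is not part of the website code):

IWindsorContainer container = new WindsorContainer();
container.AddComponent("IProductProvider",
  typeof(IProductProvider),
  typeof(FakeProductProvider));

// Anything that now asks the container for IProductProvider, as List() and
// Modify() do above, receives the fake, so the test controls the data.
IProductProvider productProvider = container.Resolve<IProductProvider>();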

It's been good fun, I hope I see some more places I can implement IoC during the development of this website.

2008-01-30

MonoRail

I'm working on a new website for work. I've decided to use ECO for the business model due to how much time it saves me. I took a look at the new MVC ASP approach provided by Microsoft recently and was a bit disappointed. There were bugs in some pretty basic areas that would have been an annoyance to code around, and it just didn't feel "ready".

So, I've decided to take another look at MonoRail. I'd already written an ECO implementation for MR in the past but I decided to start the implementation from scratch. This was mainly inspired by the new EcoSpaceManager in ECOIV for ASP .NET. Using an EcoSpaceManager you can easily utilise many instances of different types of EcoSpace in the same page. I decided I would do the same.

Unlike the EcoSpaceManager I haven't gone for unique string values for identifying the EcoSpace instance I want. That approach is good in ASP .NET where you want to bind different components together to generate your HTML but it doesn't really make as much sense when everything you produce is written in code. If you want two instances of the same EcoSpace you can just use EcoSpaceStrategyHandler.GetEcoSpace().

Anyway, on to the detail:

The first thing I have done is to specify EcoSpace and EcoSpaceProvider settings as reflection attributes on the class (controller) and method (action). Like so

[UseEcoSpacePool(true)]
public class AccountController: EcoSmartDispatcherController
{
  public void Index()
  {
  }

  [UseEcoSpacePool(false)]
  public void SomethingElse()
  {
  }
}

In this example the EcoSpace pool will be used for all actions (methods) except SomethingElse which explicitly says not to use it.

If you want to specify EcoSpace type specific settings in your web app then this is possible too:

[UseEcoSpacePool(true)]
[UseEcoSpacePool(typeof(MyEcoSpace), false)]
public class AccountController: EcoSmartDispatcherController
{
  public void Index()
  {
  }

  [UseEcoSpacePool(typeof(MyEcoSpace), true)]
  public void SomethingElse()
  {
  }
}

In this example the default is to use the EcoSpace pool unless you are retrieving an instance of MyEcoSpace, in which case it won't be used. However, if the action being invoked is SomethingElse() then retrieving an instance of MyEcoSpace will use the pool. The order of priority is as follows:
  1. Apply settings on the class that are not specific to an EcoSpace type.
  2. Apply settings on the method that are not specific to an EcoSpace type.
  3. Apply settings on the class that are specific to an EcoSpace type.
  4. Apply settings on the method that are specific to an EcoSpace type.
So the order of priority is that Method settings override Class settings, and then EcoSpace type specific settings override non specific settings.

Pretty versatile eh? But how do you get an instance of an EcoSpace? Well, that's pretty simple too!

public void Index()
{
  MyEcoSpace myEcoSpace = GetEcoSpace<MyEcoSpace>();
  //This will return the same instance
  MyEcoSpace myEcoSpace2 = GetEcoSpace<MyEcoSpace>();
}

Any EcoSpaces requested in this manner will automatically be "released" immediately after the method finishes executing so you don't need to worry about it at all. I say "released" because it will either be Disposed, stuffed into the session, or returned to the pool based on your EcoSpaceProvider settings for this class/method/EcoSpace type.

That's not the end of the story though. You can bind instances of your objects directly to HTML and back. To do this you need to identify your business class with a reflection attribute "EcoDataBind", like so:

[AllowDeactivateDirty(true)]
public void Create(
  [EcoDataBind(typeof(MyEcoSpace), "Product", CreateIfNoObjectId=true)]Product product)
{
  PropertyBag["Product"] = product;
}

Here I have stated that it is okay to Dispose the EcoSpace instance if it contains dirty objects at the end of this method. The method itself consists of a single parameter named "product" of type "Product". The EcoDataBind might look a bit overwhelming but it says this:
  1. The EcoSpace type you need to home the Product object is MyEcoSpace.
  2. The prefix in the form to look for is "Product", so when MonoRail sees <input name="Product.Name"> it knows that the value should go into the Name property of this product.
  3. If there is no ExternalId in the form identifying the object to fetch from the data storage before updating then a new instance should be created.
As a result, when you run your app and go to localhost/account/create you will see that the Create method receives a new Product instance in its product parameter. When the user posts the form back you will again see an instance, but containing the updated values. What does the HTML look like? I have used the Brail view engine and HTML helpers to output the HTML I need. This allows you to use whatever HTML you like but then easily add the <input> etc. based on your current object.

${HtmlHelper.Form('create.rails')}

  ${EcoFormHelper.PersistedExternalId('Product.ExternalId', Product)}

  Name : ${HtmlHelper.InputText('Product.Name', Product.Name)}

  Current version number : ${HtmlHelper.InputText('Product.CurrentVersionNumber', Product.CurrentVersionNumber)}

  ${HtmlHelper.SubmitButton('Save')}

${HtmlHelper.EndForm()}
I have split the lines up a bit to make them easier to tell apart visually.

First an EcoFormHelper is told to output a hidden input named Product.ExternalId for the product we set in the C# method (see PropertyBag["Product"] = product). EcoFormHelper.ExternalId will output the ExternalId for the object, whereas PersistedExternalId will only output it if the object is not new; this is useful in situations like this, when the object was disposed of along with the EcoSpace it belonged to and we can just create a new instance.

Next the HtmlHelper gives us an <input> named "Product.Name" and its value is set to whatever is in the Product's Name property. The same is done for CurrentVersionNumber.

A Submit button is then generated so that the user may post their changes.

Summary
Well, this little example shows that I can implement a nice clean MVC style approach to writing web apps with ECO and not have to worry about constructing EcoSpace instances in code, fetching objects from the data storage and so on manually; everything is done for me.

2008-01-29

What's in a name?

When thinking of a name for a new product I wish people would be more original. For example, I have just spent ages looking for information about how to create a ternary in a scripting language known as Brail. Instead of finding what I want I have had to work my way through loads of information about blind people.

2008-01-24

Single application instance

I needed my app to be a single-instance app. It was easy to implement, like so:

bool mutexIsNew;
Mutex mutex = new Mutex(true, "SomeUniqueID", out mutexIsNew);
if (mutexIsNew)
  Application.Run(new MainForm());


A problem arose though when my app started to need to receive command line arguments. How do you also pass those on to the original instance? There is a class for this purpose named "WindowsFormsApplicationBase"; it lives in Microsoft.VisualBasic.dll.

01: Add a class to your project like so:
internal class SingleInstanceApplication : WindowsFormsApplicationBase
{
  private MainForm MainFormInstance;

  internal SingleInstanceApplication()
  {
    IsSingleInstance = true;
    EnableVisualStyles = true;
    MainFormInstance = new MainForm();
    MainForm = MainFormInstance;
  }

  protected override bool OnStartup(StartupEventArgs eventArgs)
  {
    // Handle the start-up arguments, then let the base class continue.
    MainFormInstance.AcceptCommandArguments(eventArgs.CommandLine);
    return base.OnStartup(eventArgs);
  }

  protected override void OnStartupNextInstance(StartupNextInstanceEventArgs eventArgs)
  {
    base.OnStartupNextInstance(eventArgs);
    MainFormInstance.AcceptCommandArguments(eventArgs.CommandLine);
  }
}


02: Add a method to your MainForm class like so:
internal void AcceptCommandArguments(IList args)
{
}


03: Finally change your project source code like so:

[STAThread]
static void Main(string[] args)
{
  Application.EnableVisualStyles();
  Application.SetCompatibleTextRenderingDefault(false);
  SingleInstanceApplication singleInstanceApplication = new SingleInstanceApplication();
  singleInstanceApplication.Run(args);
}
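
What AcceptCommandArguments does is up to your app; as a rough sketch it might bring the existing window to the front and then act on whatever the second instance was started with (OpenFile here is a hypothetical method):

internal void AcceptCommandArguments(IList args)
{
  // Bring the existing instance to the front.
  if (WindowState == FormWindowState.Minimized)
    WindowState = FormWindowState.Normal;
  Activate();

  // Then act on the command line arguments from the new instance.
  foreach (object arg in args)
    OpenFile((string)arg);
}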

2008-01-22

ExternalID is not related to ECO_TYPE

I just thought I'd blog about this because it seems to be causing some confusion.

ECO_TYPE
Use:
An integer used in the database to identify the type of object a table row represents.

Obtained from:
The ECO_TYPE table using the name of the class.

Life span:
Permanent. The name of the class in ECO_TYPE changes when you rename your class, but the integer value never changes.


ExternalId
Use:
A way of identifying a class instance across EcoSpace instances. Kind of like a string version of a pointer to an object.

Obtained from:
Index of the class in TypeSystem.AllClasses + "!" + the ECO_ID of the instance.

Life span:
Although the ECO_ID never changes, the class index might change when you add or remove classes within the model.


I just want to point out that ExternalID and ECO_TYPE are unrelated. When you change your model it does not mean that the ECO_TYPE in your DB must change. Here is an example

01: You create a model with 2 classes in it. The first class is the root, the next is a subclass
ClassA <---- ClassC

02: You generate the DB for the first time. This creates the ECO_TYPE table data with the following values

ECO_TYPE  CLASSNAME
1         ClassA
2         ClassC

03: You create an instance of ClassA and an instance of ClassC, then look in the ClassA table in the DB. You will see something like this

ECO_ID  ECO_TYPE
1       1
2       2


04: Retrieving the ExternalID for each instance will return 1!1 and 2!2. This looks like the ECO_TYPE but it isn't!

05: You now change the model like so
ClassA <---- ClassB <---- ClassC

06: You evolve your DB.

The result of this is the following

A: A new entry in ECO_TYPE, to enable ECO to see when an object is an instance of ClassB

ECO_TYPE  CLASSNAME
1         ClassA
2         ClassC
3         ClassB

Note how the types are unchanged. There is no specific order to this list at all. It's a first-come-first-served kind of arrangement.

B: Your ExternalIDs now change. The class index returned for each class is
ClassA = 1
ClassB = 2
ClassC = 3

Note that these have no relation at all to the ECO_TYPE. Just because they are both integers means nothing :-)

I hope this clears it up. Remodeling does not mean all your DB type data needs to be updated!


Pete

2008-01-20

Why I don't use ExternalID for URLs

An ExternalID consists of two parts, ClassID!ObjectID.

ClassID isn't really required for ECO controlled DB structures because there is always a root table containing the ObjectID + ClassID, but when you use ECO to map to an existing DB structure there needs to be a way to know that object 1234 needs to be fetched from the PERSON table.

So, now that we know why the ExternalID consists of two parts, on to why I don't use it (much). If I am writing a website with ECO and I use the ExternalID for my URL like so

www.mysite.com/showarticle.aspx?id=23!1234

The number 23 is determined by looking for the class's index in the list of all classes in the model. This is the problem! If you change your model by adding a new class then your index may change from 23 to 24. Not a big problem for your application, but persistent links to that URL from other sites such as Google will no longer work.

An ExternalID is just like an object's pointer address. When you restart your application that address is no longer valid. ExternalID lives longer than an application restart, but doesn't always live past an application remodeling as explained above.

My tips are as follows:
  1. For passing object instances around in code, just pass the object instance itself.
  2. For passing a reference to an object to another EcoSpace instance in the same application use ExternalID.
  3. For storing away a reference for an unknown period of time (config file settings, URLs) add a unique attribute to your class such as an AutoInc and retrieve it using that value in an OCL evaluation.
URLs are really the only example I can think of where an ExternalID is persisted externally and expected to live longer than the life of the current model. In this case I think the following URL is much more pleasing to the eye anyway

www.mysite.com/showarticle.aspx?id=1234
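
For tip 3, fetching the article behind that URL might look something like this. Article and its autoId attribute are made up for the example, and ecoSpace is whatever EcoSpace instance you have to hand; the Ocl call follows the same pattern I've used elsewhere on this blog.

// Look the object up by its own unique attribute instead of by ExternalID.
IEnumerable<Article> matches = ecoSpace.Ocl
  .Evaluate("Article.allInstances->select(a | a.autoId = 1234)")
  .GetAsIList<Article>();

Article article = null;
foreach (Article match in matches)
  article = match;   // a unique attribute gives at most one match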

2008-01-17

RSA signed streams

I'm writing an app that needs me to be able to write to a file and make sure that nobody changes it. The idea that I had was to create 2 classes:

  1. WritableSignedStream:

    Accepts a targetStream parameter + a privateKeyXml string. I add some space at the beginning of the target stream to reserve it for header info. You then write to the stream as normal (it reports a smaller Length than that of the target stream so your app is unaware of the header). When you close the stream it writes the public key to the stream + an RSA encrypted hash of the data part of the file.
  2. ReadableSignedStream:

    Accepts a sourceStream parameter in the constructor. Just before you read for the first time it will compute an MD5 hash of the data part of the file, and then compare it with the signed hash in the file (which I first decrypt using the stored public key).

These classes therefore provide two functions:

  • You can sign a stream; the app can check the stream was signed by you by inspecting the public key within it.
  • You can additionally pass a publicKeyXml to the ReadableSignedStream constructor and it will use that instead of the public key within the stream; this allows you to ensure nobody has altered your file in any way.
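
To show how the two classes are meant to fit together, here is a rough usage sketch. The constructor signatures are assumptions based on the descriptions above rather than a final API, and privateKeyXml/publicKeyXml are an RSA key pair exported with ToXmlString.

// Writing: the stream signs the data with the private key when it is closed.
using (FileStream target = File.Create("licence.dat"))
using (WritableSignedStream signedOutput = new WritableSignedStream(target, privateKeyXml))
{
  byte[] payload = Encoding.UTF8.GetBytes("some licence data");
  signedOutput.Write(payload, 0, payload.Length);
}

// Reading: pass the expected public key so any tampering is detected.
using (FileStream source = File.OpenRead("licence.dat"))
using (ReadableSignedStream signedInput = new ReadableSignedStream(source, publicKeyXml))
{
  byte[] buffer = new byte[(int)signedInput.Length];
  signedInput.Read(buffer, 0, buffer.Length);   // the signed hash is checked before the first read
}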

Anyway, to the point. I could always use RSACryptoServiceProvider.Encrypt() to encrypt the MD5 hash, but whenever I tried to Decrypt() I would get an exception telling me that my public key was a "Bad key".

It annoyed me for a while, but then I realised something pretty obvious really. The way RSA works is that you are supposed to Encrypt using the public key and Decrypt using the private key. I was using it the wrong way around. The solution was simple....

(Signing)
RSACryptoServiceProvider RsaProvider;
RsaProvider = new RSACryptoServiceProvider();
RsaProvider.FromXmlString(privateKeyXml);

MD5 md5 = new MD5CryptoServiceProvider();
byte[] md5HashBuffer = md5.ComputeHash(dataStream);
byte[] signedMD5HashBuffer = GetSignedMd5Hash(md5HashBuffer);


private byte[] GetSignedMd5Hash(byte[] hashBuffer)
{
  RSAPKCS1SignatureFormatter rsaFormatter = new RSAPKCS1SignatureFormatter();
  rsaFormatter.SetKey(RsaProvider);
  rsaFormatter.SetHashAlgorithm("MD5");
  return rsaFormatter.CreateSignature(hashBuffer);
}


(Verifying)
MD5 md5 = new MD5CryptoServiceProvider();
byte[] currentMD5HashBufferForFile = md5.ComputeHash(dataStream);
RSACryptoServiceProvider rsaProvider = new RSACryptoServiceProvider();
rsaProvider.FromXmlString(PublicKeyXml);


RSAPKCS1SignatureDeformatter rsaDeFormatter = new RSAPKCS1SignatureDeformatter();
rsaDeFormatter.SetKey(rsaProvider);
rsaDeFormatter.SetHashAlgorithm("MD5");
if (!rsaDeFormatter.VerifySignature(currentMD5HashBufferForFile, FileEncryptedMd5Hash))
  throw new Exception("Someone has messed with the file");

I hope someone finds this useful :-)

No symbols loaded

This has been driving me mad for hours now! Whenever I run my PocketPC Compact Framework app I cannot debug it! None of the breakpoints are hit; each breakpoint just shows as an empty circle instead of a solid one.

So, what was the solution? I tried deleting all PDB files on my hard disk but that didn't do it. In the end manually deleting all of the files previously deployed to my PPC did the trick. Maybe VS couldn't overwrite them or something? No idea, but at least it works now :-)

2008-01-15

Geek quotes

This one is probably my favourite:
"There are 10 types of people in the world, those who understand binary and those who don't".

Here is one I came up with myself some years ago and used to use in my newsgroup signature:
"Blessed are the geek, for they shall public class GeekEarth : Earth".