Monday, November 16, 2009

Image to Byte Array

In Some ways to convert a byte array to Bitmap we saw how to convert a Bitmap to and from a byte array. As an alternative, I use the following extension method to convert an Image to a byte array:

[csharp]
using System.Drawing;

namespace DC.Core.Extension
{
    public static class ImageExtension
    {
        private static ImageConverter _converter;

        public static byte[] ToByteArray(this Image image)
        {
            if (_converter == null) _converter = new ImageConverter();
            return (byte[])_converter.ConvertTo(image, typeof(byte[]));
        }
    }
}
[/csharp]

Thursday, September 24, 2009

F# and Units of Measure

On Channel 9 they have an interview with Andrew Kennedy about the new Unit of Measure feature in F#.

This is a really cool feature and I have plans to use it; however, there is one problem I have with it: the UOM information is baked into the compiler and can't be persisted.

An example Andrew uses is NASA's Mars Climate Orbiter, which was lost due to a units-of-measure issue (long story short, the software assumed it was working in newtons when the values were actually pounds-force). This is a good example of how, when you are doing calculations, it would be nice if something other than the brain of the programmer were checking the units. But there is another time that UOM is important: when inputting the data.

I recently worked on an application where the units for a given value could be in English or SI (metric) units and we had an elaborate system to deal with them. The end user could even come up with their own unit system if they wanted to and the application had to be able to handle that.

Internally and for storage we decreed that all units would be metric, no ifs, ands, or buts about it. Only when we displayed the values to the end user did we convert them to their chosen unit system. And for importing data we required that the imported values have their UOM indicated.
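C# has nothing like F#'s compile-time units, but the decree above (store SI internally, convert only at the display edge) can at least be encoded in the type system. A minimal sketch; the `Newtons` type and its members are hypothetical, not from the application described:

```csharp
using System;

// Hypothetical sketch: a force is always stored in SI (newtons);
// pound-force values can only enter or leave through explicit conversions.
public readonly struct Newtons
{
    // 1 lbf = 4.4482216152605 N (exact by definition).
    private const double NewtonsPerPoundForce = 4.4482216152605;

    public double Value { get; }

    public Newtons(double value) { Value = value; }

    public static Newtons FromPoundsForce(double lbf) =>
        new Newtons(lbf * NewtonsPerPoundForce);

    public double ToPoundsForce() => Value / NewtonsPerPoundForce;
}
```

Because a raw double can't silently become a `Newtons`, a pounds-force value has to pass through `FromPoundsForce`, which is roughly the invariant we enforced by decree.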

The F# UOM feature is a truly remarkable step in solving one of my pet issues. The next step will have to include a way to persist the UOM information with the data to any data store because without this there is no way to guarantee that the data values coming in are really what we say they are.

Thursday, May 21, 2009

RESTful Architecture

Found the following in the article ASP.NET MVC: Using RESTful Architecture and thought it worth noting. It describes what the MVC controller actions should be; no more than these are needed:
The endpoint needs to be something meaningful, and Rails uses a nice convention that divides the endpoints into eight main bits:

  • Index - the main “landing” page. This is also the default endpoint.

  • List - a list of whatever “thing” you’re showing them - like a list of Products.

  • Show - a particular item of whatever “thing” you’re showing them (like a Product)

  • Edit - an edit page for the “thing”

  • New - a create page for the “thing”

  • Create - creates a new “thing” (and saves it if you’re using a DB)

  • Update - updates the “thing”

  • Delete - deletes the “thing”


Normally the last three are “action only” and don’t have a view associated with them. So if you “create” a Product (from the New view, using Create as the action on the form), you’d just redirect them to the List or Edit views. Likewise if you Update a Product from the Edit page (using Update as the action on the form) you might want to go back to the Edit view and show a status update.

I don't agree that these should be the ONLY actions you can have, but I think it is a good guideline to follow. Now I have to go update my MVC controllers.

Sunday, April 12, 2009

Thomas dancing to I want Candy

Aunt Tammy sent Thomas and Zoey musical cards for Easter. Here we catch Thomas enjoying his.

Thursday, April 02, 2009

Json for jqGrid from ASP.Net MVC

jqGrid takes a specific format for its json (taken from jqGrid documentation):

[js]
{
  total: "xxx", page: "yyy", records: "zzz",
  rows: [
    {id: "1", cell: ["cell11", "cell12", "cell13"]},
    {id: "2", cell: ["cell21", "cell22", "cell23"]},
    ...
  ]
}
[/js]

The tags mean the following:

total - Total number of Pages.
page - Current page Index.
records - Total number of records in the rows group.
rows - An array with the data plus an identifier.
id - The unique row identifier; from what I have found it needs to be an int.
cell - An array of the data for the grid.

The ASP.Net MVC framework has the JsonResult response type which we can use to populate the jqGrid. As an example I created a Person model and a method to return some data:

[csharp]
public class Person
{
    public int ID { get; set; }
    public string Name { get; set; }
    public DateTime Birthday { get; set; }
}

public IEnumerable<Person> GetABunchOfPeople()
{
    yield return new Person {ID = 1, Name = "Darren", Birthday = new DateTime(1970, 9, 13)};
    yield return new Person {ID = 2, Name = "Dawn", Birthday = new DateTime(1971, 6, 1)};
    yield return new Person {ID = 3, Name = "Thomas", Birthday = new DateTime(1995, 10, 3)};
    yield return new Person {ID = 4, Name = "Zoey", Birthday = new DateTime(1997, 8, 15)};
}
[/csharp]

Generating the JSON is done as follows; this goes into a PersonModel class:

[csharp]
public JsonResult GetABunchOfPeopleAsJson()
{
    var rows = GetABunchOfPeople()
        .Select(c => new
        {
            id = c.ID,
            cell = new[]
            {
                c.ID.ToString(),
                c.Name,
                c.Birthday.ToShortDateString()
            }
        }).ToArray();

    return new JsonResult
    {
        Data = new
        {
            page = 1,
            records = rows.Length,
            rows,
            total = 1
        }
    };
}
[/csharp]


The controller then would look like:
[csharp]
public class PersonController : Controller
{
    public ActionResult Index()
    {
        return View();
    }

    public JsonResult GetAllPeople()
    {
        var model = new Models.PersonModel();
        return model.GetABunchOfPeopleAsJson();
    }
}
[/csharp]
The view doesn't need a strongly-typed model; a plain ViewPage is enough if all you want is for the jqGrid to show the data. In the view (Index.aspx in this case) you add a table and a div for the jqGrid:

<table id="dictionary" class="scroll" cellpadding="0" cellspacing="0"></table>
<div id="pager" class="scroll" style="text-align: center;"></div>

Configuring jqGrid looks like this:
[js highlight="6,8"]
$(document).ready(function() {

    $("#dictionary").jqGrid({
        caption: "Tank Dictionary",
        pager: $("#pager"),
        url: '<%= ResolveUrl("~/Person/GetAllPeople") %>',
        editurl: '<%= ResolveUrl("~/Person/Edit") %>',
        datatype: 'json',
        mtype: 'GET',
        colNames: ['ID', 'Name', 'Birthday'],
        colModel: [
            { name: 'ID', index: 'ID', width: 150, resizable: true, editable: false },
            { name: 'Name', index: 'Name', width: 200, resizable: true, editable: true },
            { name: 'Birthday', index: 'Birthday', width: 300, resizable: true, editable: true }
        ],
        sortname: 'ID',
        sortorder: 'desc',
        viewrecords: true,
        height: '100%',
        imgpath: '<%= ResolveUrl("~/Scripts/jquery/jqGrid/themes/basic/images") %>'
    });
});
[/js]
The url option is set to call the Person controller action that returns the JSON results. Note that the datatype is json.

The editurl option will be covered in a later post.

Friday, March 27, 2009

LINQ to SQL Roundtrips: SQL Trace

I was looking into reducing the number of database round trips that LINQ to SQL took and found an article by David Hayden that fit the bill. I wanted to see what was actually happening so I slapped together a simple demo.

Using the Pubs database I created a console app, added the LINQ to SQL classes then created a simple repository class:

public class AuthorRepository
{
    public author GetAuthorWithTitles(string authorId)
    {
        var db = new PubsDataClassesDataContext();
        return db.authors.FirstOrDefault(a => a.au_id == authorId);
    }

    public author GetAuthorWithTitlesWithUsing(string authorId)
    {
        using (var db = new PubsDataClassesDataContext())
            return db.authors.FirstOrDefault(a => a.au_id == authorId);
    }

    public author GetAuthorWithTitlesPrefetch(string authorId)
    {
        using (var db = new PubsDataClassesDataContext())
        {
            var options = new DataLoadOptions();
            options.LoadWith<author>(a => a.titleauthors);
            options.LoadWith<titleauthor>(ta => ta.title);

            db.LoadOptions = options;
            return db.authors.FirstOrDefault(a => a.au_id == authorId);
        }
    }
}

I created a simple method to dump the author and titles:

class Program
{
    static void Main(string[] args)
    {
        var repo = new AuthorRepository();

        DumpAuthorToConsole(
            repo.GetAuthorWithTitles("998-72-3567")
            , "Authors without prefetch and without using statement");

        DumpAuthorToConsole(
            repo.GetAuthorWithTitlesWithUsing("998-72-3567")
            , "Authors without prefetch but with using statement");

        DumpAuthorToConsole(
            repo.GetAuthorWithTitlesPrefetch("998-72-3567")
            , "Authors with prefetch and with using statement");
    }

    private static void DumpAuthorToConsole(author author, string message)
    {
        Console.WriteLine();
        Console.WriteLine(new string('-', 50));
        Console.WriteLine(message);

        try
        {
            Console.WriteLine("Author Name: {0} {1}", author.au_fname, author.au_lname);
            Console.WriteLine("Count of Titles: {0}", author.titleauthors.Count);

            foreach (var titleauthor in author.titleauthors)
                Console.WriteLine("\tBook Title: {0}", titleauthor.title.title1);
        }
        catch (Exception e)
        {
            Console.WriteLine(">>> FAIL! <<<");
            Console.WriteLine(e.Message);
        }
    }
}

I did cheat and added some titles to the chosen author just so I could have at least five titles returned. The results are:

--------------------------------------------------
Authors without prefetch and without using statement
Author Name: Albert Ringer
Count of Titles: 5
Book Title: Silicon Valley Gastronomic Treats
Book Title: Secrets of Silicon Valley
Book Title: Computer Phobic AND Non-Phobic Individuals: Behavior Variations
Book Title: Is Anger the Enemy?
Book Title: Life Without Fear

--------------------------------------------------
Authors without prefetch but with using statement
Author Name: Albert Ringer
>>> FAIL! <<<
Cannot access a disposed object.
Object name: 'DataContext accessed after Dispose.'.

--------------------------------------------------
Authors with prefetch and with using statement
Author Name: Albert Ringer
Count of Titles: 5
Book Title: Silicon Valley Gastronomic Treats
Book Title: Secrets of Silicon Valley
Book Title: Computer Phobic AND Non-Phobic Individuals: Behavior Variations
Book Title: Is Anger the Enemy?
Book Title: Life Without Fear
Press any key to continue . . .

Notice that there are three cases above, with one failing.

What are we looking at?

Case 1: Authors without prefetch and without using statement

This is the GetAuthorWithTitles() method; it simply creates the DataContext and returns the author object. Simple, clean, and easy, but not very efficient.

First, note that the DataContext was not disposed, so it is still lingering out there waiting to be garbage collected. The SQL trace (screenshot not preserved) showed the following:

The first sp_executesql loads the author, the second loads all the titleauthor rows for the author, and the rest select each individual title from the titles table. So one author, a count of his titles, and a list of each title requires seven round trips to the database. Notice that each round trip opens a new connection.

Case 2: Authors without prefetch but with using statement

This is the GetAuthorWithTitlesWithUsing() method, the same as before in that it simply creates a DataContext and returns the author, but then properly disposes of the context. It fails to return the count of titles and the individual titles because lazy loading cannot happen on a disposed context, so an exception is thrown.

The trace (screenshot not preserved) showed only one sp_executesql: the initial author select.
Case 3: Authors with prefetch and with using statement

In this case we use DataLoadOptions to tell LINQ to load the titleauthors with the author and the titles with the titleauthors. The DataContext is disposed after returning the author.

The trace (screenshot not preserved) showed two sp_executesql calls, both on the same connection. The first returned the author:

exec sp_executesql N'SELECT TOP (1) [t0].[au_id], [t0].[au_lname], [t0].[au_fname], [t0].[phone], [t0].[address], [t0].[city], [t0].[state], [t0].[zip], [t0].[contract]
FROM [dbo].[authors] AS [t0]
WHERE [t0].[au_id] = @p0',N'@p0 varchar(11)',@p0='998-72-3567'

The second returned the titleauthor and titles:

exec sp_executesql N'SELECT [t0].[au_id], [t0].[title_id], [t0].[au_ord], [t0].[royaltyper], [t1].[title_id] AS [title_id2], [t1].[title] AS [title1], [t1].[type], [t1].[pub_id], [t1].[price], [t1].[advance], [t1].[royalty],
[t1].[ytd_sales], [t1].[notes], [t1].[pubdate]
FROM [dbo].[titleauthor] AS [t0]
INNER JOIN [dbo].[titles] AS [t1] ON [t1].[title_id] = [t0].[title_id]
WHERE [t0].[au_id] = @x1',N'@x1 varchar(11)',@x1='998-72-3567'

Lessons Learned

First, beware of disposing the DataContext when you want to lazy-load entities. If you are going to lazy-load, come up with a way to properly dispose of the context later.

Second, round trips to the database can be reduced by using DataLoadOptions. This is not a guarantee of better performance, but it is a step in the right direction.

Friday, March 13, 2009

DataTable: Finding Differences in Column Values

I have two tables in a typed DataSet and I want to compare one column in each table to see if TableA has values that are not in TableB.

IEnumerable<string> valuesInA = typedDataSet.TableA.AsEnumerable().Select(row => row.AField);
IEnumerable<string> valuesInB = typedDataSet.TableB.AsEnumerable().Select(row => row.BField);
 
foreach (var notInB in valuesInA.Except(valuesInB))
    Debug.WriteLine(string.Format("Value not in TableB.BField: {0}", notInB));

This is assuming that both AField and BField are the same type.
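One thing to keep in mind is that `Except` is a set difference, so duplicate values in the first sequence come back only once. A minimal sketch with plain arrays standing in for the two columns (the values are invented):

```csharp
using System;
using System.Linq;

// Stand-ins for TableA.AField and TableB.BField.
var valuesInA = new[] { "alpha", "beta", "beta", "gamma" };
var valuesInB = new[] { "alpha", "gamma" };

// Except is set-based: "beta" comes back once even though
// it appears twice in the first sequence.
var notInB = valuesInA.Except(valuesInB).ToArray();

foreach (var value in notInB)
    Console.WriteLine("Value not in B: " + value);
```

If you need to preserve duplicates, `Where` with a `Contains` (or a HashSet lookup) is the way to go instead.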

Wednesday, February 25, 2009

DataSet: More reasons to not like it

So I am still living in the world of DataSets, when will I ever learn?

The task was simple: take the contents of DataSetB and merge it into DataSetA. Sure, just use:

DataSetA.Merge(DataSetB, true)

and life will be good. But wait! Why is it that when I try to save the merged data in DataSetA to the database, it doesn't show up?

Because the RowState of every row in every table is Unchanged, and a merge operation does not change the RowState. So if I want all the data from B to show up as new rows when I save, I have to change the row state of every row in every table to Added. Sure, I should be able to loop through it all and set each row's state like this:

row.RowState = DataRowState.Added;

Not so fast, grasshopper! row.RowState does not have a setter! Isn't that asinine? You have to use:

row.SetAdded();

Whatever; I say setter, you say method. Long story short, I created an extension method that works.

public static DataSet MergeAllDataAsNew(this DataSet target, DataSet source)
{
    // Note: SetAdded is only valid on rows in the Unchanged state.
    foreach (DataTable table in source.Tables)
        foreach (DataRow row in table.Rows)
            row.SetAdded();

    target.Merge(source, true);

    return target;
}

Brute Force works.
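A quick way to convince yourself the pattern works is a round trip with an in-memory DataSet; the table and column names here are invented for the example:

```csharp
using System;
using System.Data;

var source = new DataSet();
var people = source.Tables.Add("People");
people.Columns.Add("Name", typeof(string));
people.Rows.Add("Darren");
people.AcceptChanges();              // every row is now Unchanged

// The fix: flip each row to Added before merging.
foreach (DataRow row in people.Rows)
    row.SetAdded();

var target = new DataSet();
target.Merge(source, true);

// The merged row keeps its Added state, so a save would insert it.
Console.WriteLine(target.Tables["People"].Rows[0].RowState);
```

Skip the `SetAdded` loop and the merged row stays Unchanged, which is exactly the silent no-op save described above.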

Wednesday, January 28, 2009

Using IDataErrorInfo for Validation

I read an article on CodeProject titled Total View Validation in which the author complains that IDataErrorInfo is inadequate for WPF validation. The assumption he makes is that all the validation code needs to go into the IDataErrorInfo.this[string] indexer, like so:

[csharp]
public string this[string name]
{
    get
    {
        string result = null;
        if (name == "Age")
        {
            if (this.age < 0 || this.age > 150)
            {
                result = "Age must not be less than 0 or greater than 150.";
            }
        }
        return result;
    }
}
[/csharp]

This is not how IDataErrorInfo should be used, as it puts business logic in the model, as the author himself points out. His solution, though, was to create another mechanism to notify the user of validation errors and to not use IDataErrorInfo at all.

The solution I would propose is to follow how DataRows and DataTables handle error information, by implementing methods to set and clear the object's error info:

[csharp]
public class SomeClass : IDataErrorInfo
{
    public SomeClass()
    {
        InitDataErrorInfo();
    }

    public string Name { get; set; }

    public string Age { get; set; }

    #region Data Error Info

    private Dictionary<string, string> _propertyErrors;

    private void InitDataErrorInfo()
    {
        var properties = this.GetType().GetProperties();

        _propertyErrors = new Dictionary<string, string>
        {
            // This will act as an overall error message for the entire object.
            {this.GetHashCode().ToString(CultureInfo.InvariantCulture), string.Empty}
        };

        foreach (var propertyInfo in properties)
            _propertyErrors.Add(propertyInfo.Name, string.Empty);
    }

    public void ClearDataErrorInfo()
    {
        // ToList so we are not modifying the dictionary while enumerating its keys.
        foreach (var property in _propertyErrors.Keys.ToList())
            _propertyErrors[property] = string.Empty;
    }

    public void ClearDataErrorInfo(string propertyName)
    {
        AssertThisHasPropertyWithName(propertyName);
        _propertyErrors[propertyName] = string.Empty;
    }

    public void SetError(string error)
    {
        SetError(this.GetHashCode().ToString(CultureInfo.InvariantCulture), error);
    }

    public void SetError(string propertyName, string error)
    {
        AssertThisHasPropertyWithName(propertyName);
        _propertyErrors[propertyName] = string.Format("{0}{1}{2}", _propertyErrors[propertyName]
            , Environment.NewLine, error);
    }

    public string this[string propertyName]
    {
        get
        {
            AssertThisHasPropertyWithName(propertyName);
            return _propertyErrors[propertyName];
        }
    }

    public string Error
    {
        get
        {
            var errors = new StringBuilder();
            foreach (var propertyError in
                _propertyErrors.Where(propertyError => !string.IsNullOrEmpty(propertyError.Value)))
                errors.AppendLine(propertyError.Value);

            return errors.ToString().Trim();
        }
    }

    protected void AssertThisHasPropertyWithName(string propertyName)
    {
        if (!_propertyErrors.ContainsKey(propertyName))
        {
            throw new ArgumentException(string.Format("No property named {0} on {1}."
                , propertyName, this.GetType().FullName));
        }
    }

    #endregion
}
[/csharp]

Note that there is no validation here, only reporting of whether the object has errors. This approach takes advantage of the validation notification already built into WPF as well as WinForms.
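On the WPF side nothing extra is needed beyond opting the binding into IDataErrorInfo; a minimal binding sketch (the property name matches the Age property above, everything else is hypothetical):

```xml
<!-- ValidatesOnDataErrors=True makes WPF query the IDataErrorInfo
     indexer for "Age" each time the source value updates. -->
<TextBox Text="{Binding Age,
                Mode=TwoWay,
                UpdateSourceTrigger=PropertyChanged,
                ValidatesOnDataErrors=True}" />
```

When the indexer returns a non-empty string, WPF flags the control with the default red error adorner, no extra notification mechanism required.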

Friday, January 09, 2009

Y2K38

At eight seconds past 3:14 a.m. UTC on January 19, 2038, computers storing time in a signed 32-bit time_t will think it's actually a quarter to 9 p.m. on December 13, 1901.

Oh Crap.
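The date arithmetic is easy to verify: a signed 32-bit time_t rolls over 2^31 seconds after the Unix epoch. A quick sketch:

```csharp
using System;

var epoch = new DateTimeOffset(1970, 1, 1, 0, 0, 0, TimeSpan.Zero);

// Largest value a signed 32-bit time_t can hold: 2^31 - 1 seconds.
var lastGoodSecond = epoch.AddSeconds(int.MaxValue);
Console.WriteLine(lastGoodSecond);   // 2038-01-19 03:14:07 UTC

// One second later the counter wraps to int.MinValue.
var afterTheWrap = epoch.AddSeconds(int.MinValue);
Console.WriteLine(afterTheWrap);     // 1901-12-13 20:45:52 UTC
```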

UPDATE: At 11:31:30 pm UTC on Feb 13, 2009, Unix time will reach 1,234,567,890.