16 Oct 2008

Scrum for Team System Version 2.2 support for SQL Server 2008

As I mentioned in a previous blog entry, I’ve been heavily involved in the Scrum for Team System process template since the start of version 2 and have written the majority of the reports. I have faced, and still face, some challenges while doing this, some of them around how to display the data within Reporting Services 2005.

The release of Team Foundation Server 2008 Service Pack 1 in August enables Team Foundation Server 2008 to be used on a SQL Server 2008 environment. This will make some of the reporting challenges I faced with Reporting Services 2005 disappear, thanks to major changes made in Reporting Services 2008 such as the Tablix control and the wider range of Dundas chart controls.

Within this release we have made a patch which will update some of the reports for Reporting Services 2008. This fixes some of the places where the Reporting Services 2005 patterns and workarounds had become visible. I am currently looking at how Scrum for Team System can make more use of Reporting Services 2008, which we will release as an advanced reporting pack in the future.

To find out more about the Scrum for Team System SQL Server 2008 patcher, please read the Scrum for Team System blog.


14 Oct 2008

The Scrum for Team System report slide show tool

As I have mentioned in other blog posts, I’ve been heavily involved in the Scrum for Team System process template since the start of version 2, and to coincide with the release of version 2.2 I would like to bring your attention to a new tool, which I helped to develop, that has been added to this release.

I started creating this tool due to some feedback we got from beta users of the “TaskBoard for Team System” tool. The request was to add a demonstration mode to the application which would show things like the swim lane view and the following reports: the sprint burndown chart and the sprint cumulative flow, as these can be viewed within the application.

The reason for the request was to create the sense of “look guys, this is what we are working on, and this is how we are doing” throughout the day, not only during the scrum meetings. This also happens a lot within our own projects, which do something similar as well. Another reason for keeping the tool separate from the Task Board application was to avoid polluting what the Task Board application was supposed to do.

So a new page called ReportSlideShow was added to the ScrumforTeamSystem virtual folder. This allows a selection of reports from one or more Team System projects to be displayed using their parameters’ default values. The reports don’t have to be Scrum for Team System reports, but there is one caveat: every report has to exist in all of the projects requested for display.

The reports that are displayed are controlled by a configuration file. To see how to configure which reports are displayed, please see the Report Slide User Guide post on the Scrum for Team System site.


Build Reports added to Scrum for Team System v2.2

Continuing from my previous post, I’d like to give an overview of the new engineering practices reports included in release 2.2.


Builds

Across a user-selected time period (defaulting to the current sprint), this report shows the total number of builds per day, broken down by status.


Build Static Analysis and Compile

This displays the number of errors and warnings, from either compilation or static analysis, for the last N builds.

[Screenshot: Build Static Analysis and Compile report]


Build Unit Tests

Shows the total number of unit tests run in the last N builds. The total is broken down by unit test status for each build.

[Screenshot: Build Unit Tests report]


Last Build Unit Test Results

This report lists each test that was run in the last build and shows its result. If a unit test failed, the failure message is displayed as well. The last build is taken to be the most recent build within the data warehouse.
[Screenshot: Last Build Unit Test Results report]


Code Coverage

This report shows the code coverage and the total code churn for the last N builds. The coverage unit can be selected from blocks (the Visual Studio default), lines or partial lines.

[Screenshot: Code Coverage report]


Build Code Churn

This report shows the total “Code Churn Count” of the last N builds, broken down by added, modified and deleted lines. It also features a drill-down option which links to the Build Files report.

[Screenshot: Build Code Churn report]


Build Files

This report, which is also the drill-down target of the Build Code Churn report, displays the files which were included in a build, or between two builds. The files are grouped by the folder in which they are located within source control, and the number of added, modified and deleted lines is shown for each changeset. The following changeset information is also included: checked in by, date, and policy override comment.

[Screenshot: Build Files report]


Build Quality

This report is based on the MSF for Agile Software Development quality indicators. It has been added by popular request and was inspired by a blog post by Ed Blankenship: http://blogs.infragistics.com/blogs/eblankenship/archive/2007/06/18/msf-agile-quality-indicators-report-for-conchango-scrum-process-template.aspx

[Screenshot: Build Quality report]


Build Duration

This shows how long the last N builds took to complete.

[Screenshot: Build Duration report]


Build League

This shows a league table of the top N people, ranked by their total number of failed or successful builds (using either an absolute count or a percentage). The report runs over a selected period, defaulting to the currently running sprint. For this report a successful build is defined as any build which is not in a failed state (which is not quite the same as a build whose state is succeeded).

[Screenshot: Build League report]

There is one caveat: a build is counted against a person if their changeset is linked to that build. So some people will be credited with build failures even when their changeset is not actually the reason the build failed.

All of the reports have an option to select which build type to run against. Where a report takes a number of last builds, N is limited to the values 10, 15, 20, 25 or 30.

These reports are not actually linked to the Scrum methodology, but they are helpful tools for engineering practices. There will be a separate forum for issues and questions, which can be found here: Engineering Practices – v2.2 Reports


New Reporting Features available in Scrum for Team System v2.2

To coincide with the release of version 2.2 of Scrum for Team System, I’d like to bring your attention to the wide range of new reports included in this release. I’ve been heavily involved in the Scrum for Team System process template since the start of version 2 and have written the majority of the new reports and tweaked some of the old ones to improve performance by moving some of the querying to the TFS cube, as well as reflecting user feedback.

One such fix, which can be found in this release, was to the burndown report, which would show a misleading (but accurate) trend line if the task data wasn’t updated on the very first day of the sprint. (This could also be fixed by making your Scrum Master work harder, but I don’t have any control over that, so I decided to tweak the report instead.)

Most of the new reports in this release are focussed on engineering practices, which I will cover in another blog post; however, there is one new report specifically targeted at Scrum: the velocity report.
Velocity is based upon the recorded historical performance of a team and is a fantastic tool for accurate release planning; please see my colleague Simon Bennett’s blog for more details.

[Screenshot: Velocity report]

So onto the new reports....


30 Sep 2008

Using NDepend to help guide Refactoring

In my other blog entries I mentioned that I have been looking into building a Team Foundation Server data warehouse adapter. After I got my initial proof-of-concept version working, I started to extend it to get a list of available builds from TFS and then import the output of Source Monitor – a popular free code metrics tool.

Now I must state that I am not a natural .NET coder. My normal day-to-day work is based around the SQL Server technology stack. I do have some .NET coding skills, but generally just enough for what is needed in and around SQL Server. So I asked my colleague Howard van Rooijen, who is one of our top .NET coders, to have a look at my code and give me some hints.

This provided me with a great opportunity to learn from Howard about some more advanced design patterns, Visual Studio tips and tricks, and how to effectively use some tools I hadn’t used before to help me write better code. The first thing Howard did was install the following on my machine:
  • ReSharper – to help me refactor the code, fortunately we have just rolled out ReSharper licenses to all .NET Developers!
  • StyleCop – to help remind me of the Microsoft C# Coding Conventions
  • TDD.NET – to ease the pain of running my unit tests
  • NDepend – a code metrics tool
Howard wanted to integrate NDepend into my TFS data warehouse adapter and suggested that the best way to understand the tool would be to use it to analyse my TFS adapter code and see if it could give us direction on where to focus our refactoring efforts.

NDepend comes with 82 code metrics out of the box, but it’s far more than a standard code metrics tool. It comes with a bundle of best-practice design rules which are written using Code Query Language (CQL) – an idea so simple and elegant that I bet the FxCop team are kicking themselves for not thinking of it first. CQL allows you to write statements like:
// Methods poorly commented
WARN IF Count > 0 IN SELECT METHODS WHERE PercentageComment < 20 AND NbLinesOfCode > 15  ORDER BY PercentageComment ASC

You can even embed CQL directly into your code via attributes. NDepend also has features such as comparing different builds (Patrick Smacchia, NDepend’s developer, demonstrated this feature on the recent .NET Framework 3.5 SP1 release), managing complexity and dependencies, and monitoring the health of your build process.

We started by running my code through NDepend, and the initial results were that my code had high instability and was not at all abstract. I couldn’t really affect the instability, as my code was just a thin wrapper around the TFS API. It also highlighted some of my old best practices that needed updating, as many of them are now redundant with C# 2.0 features (generics, static classes and so on), and it pointed out performance gains I could make.

Howard said that he was having some difficulties understanding what the intention of my code was, as he was unfamiliar with building Data Warehouse Adapters. He asked me if I could write some unit tests that would call the code and verify the output – so he could use the “Test with... Debugger” feature of TDD.NET to see what was going on with the code.

Once I had written some tests, Howard wrote two more: one that passed in a null value and another that passed in an empty string; these, he explained, were two valid edge cases for handling bad data, and to my surprise both tests failed! Once I fixed both failing tests, he asked me to run the whole test suite again, and this time I was really surprised as all of my previously passing tests now failed! The issue was with two helper methods that convert string-based representations of numbers within the data warehouse into valid number primitives.
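For illustration, the two extra edge-case tests might have looked something like the sketch below. The containing class name NumberCleaner and the use of MSTest are assumptions on my part – the original test code isn’t shown in this post.

    using Microsoft.VisualStudio.TestTools.UnitTesting;

    [TestClass]
    public class CleanNumberEdgeCaseTests
    {
        [TestMethod]
        public void CleanInt_WithEmptyString_ReturnsZero()
        {
            Assert.AreEqual("0", NumberCleaner.CleanInt(string.Empty));
        }

        [TestMethod]
        public void CleanInt_WithNull_ReturnsZero()
        {
            // Null is the other bad-data edge case; it is assumed here to behave
            // like an empty string and return "0" once the methods were fixed.
            Assert.AreEqual("0", NumberCleaner.CleanInt(null));
        }
    }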

NDepend pointed out lots of issues with these methods and Howard said that a simple Regular Expression could replace a lot of the parsing logic I had written, so I set about refactoring the code.

Before:
    public static string CleanInt(string number)
    {
        string result = string.Empty;
        bool foundNonDigit = false;
        int index = 0;

        if (number == string.Empty)
        {
            return "0";
        }

        while (!foundNonDigit)
        {
            if (char.IsDigit(number[index]))
            {
                result = string.Format("{0}{1}", result, number[index]);
                index++;
            }
            else
            {
                foundNonDigit = result.Length > 0;
            }
        }
        return result;
    }

    public static string CleanFloat(string number)
    {
        string result = string.Empty;
        bool foundNonDigit = false;
        bool foundDecmialPoint = false;
        int index = 0;

        if (number == string.Empty)
        {
            return "0";
        }
        while (!foundNonDigit)
        {
            if (char.IsDigit(number[index]) || (number[index] == '.' && !foundDecmialPoint))
            {
                result = string.Format("{0}{1}", result, number[index]);
                foundDecmialPoint = number[index] == '.';
                index++;
            }
            else
            {
                foundNonDigit = result.Length > 0;
            }
        }
        return result;
    }

After:
    public static string CleanInt(string number)
    {
        int resultFromTryParse;

        // Treat null or an empty string (after trimming) as zero
        if (string.IsNullOrEmpty(number) || number.Trim() == string.Empty)
        {
            return "0";
        }

        // If TryParse works, return the number in string format
        if (int.TryParse(number, out resultFromTryParse))
        {
            return resultFromTryParse.ToString();
        }

        // Otherwise return the result of GetCleanNumberString
        return GetCleanNumberString(number, "Integers");
    }

    public static string CleanFloat(string number)
    {
        float resultFromTryParse;

        // Treat null as zero, then clean the number and check whether it's empty
        if (string.IsNullOrEmpty(number))
        {
            return "0";
        }
        number = number.Trim();
        if (number == string.Empty)
        {
            return "0";
        }

        // If TryParse works, return the number unchanged
        if (float.TryParse(number, out resultFromTryParse))
        {
            return number;
        }

        // Otherwise return the result of GetCleanNumberString
        return GetCleanNumberString(number, "Floats");
    }

    private static string GetCleanNumberString(string number, string numberType)
    {
        // Build a regular expression with named groups and match it against the number entered
        Regex regex = new Regex(@"(?<Floats>(?<Integers>\d+)+\.\d+)|(?<Integers>\d+)", RegexOptions.Singleline);
        Match match = regex.Match(number);
        string results = string.Empty;

        // If a match was found, return the value of the requested group
        if (match.Success)
        {
            results = match.Groups[numberType].Value;
        }
        return results;
    }

Now all my tests passed and NDepend was happy too.

Next NDepend highlighted some of my methods that had too many lines of code. Howard said he was having a bit of difficulty understanding what the code was doing and explained to me the notion of Programming by Intention.

Next he paired with me and went through the method, reformatting the code. He said that by reformatting, reorganising and removing redundant variables you start to see patterns forming within the code: clumps of logic naturally come together, and you can then use ReSharper’s Extract Method refactoring to pull each clump out into a separate method. Instead of a mass of hard-to-understand code, the method now simply listed a series of steps – the intention of the method had been made clear. Howard was happy because he now understood what the method was doing, and NDepend was happy because the method was now shorter and a great deal less complex.

In my original proof of concept I had used partial classes as a mechanism to extend the main TFS Data Warehouse Adapter and implemented another partial class in order to integrate Source Monitor.  Howard pointed out that this approach wouldn’t work if we wanted to integrate a second tool.

He said that now we had cleaned up the code and broken down the few large methods into a series of smaller methods we had surfaced the behaviour required to integrate an external code metrics tool into the Data Warehouse Adapter and that we could use ReSharper’s Extract Interface refactoring to move the definitions of this behaviour into an interface called ICodeMetricAdapter and then use this interface to implement any other 3rd party code metrics tool.

Howard said that we had now essentially implemented the adapter pattern. Next we refactored the existing code to use the interface instead of the concrete type and we extracted the Source Monitor code into a new file called SourceMonitorAdapter which implemented the ICodeMetricAdapter interface. We added a List<ICodeMetricAdapter> to the parent object and then refactored the code to loop through this list and then call the current ICodeMetricAdapter.

We managed to get this all done in one afternoon. Just as I was packing up, Howard asked me, “Now, what do you need to do to add an NDepend adapter?” I created a new class called NDependAdapter and made it implement ICodeMetricAdapter. Howard showed me that I could use the Implement Members quick fix by selecting the interface and hitting ALT-ENTER. Then all I had to do was add a new instance of NDependAdapter to the List&lt;ICodeMetricAdapter&gt;, and the new NDependAdapter was automatically called!
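As a rough illustration of the shape we ended up with – the method name on the interface below is an assumption, as the post doesn’t show the actual extracted signatures – the adapter pattern looked something like this:

    using System;
    using System.Collections.Generic;

    // Hypothetical member; the real interface was extracted from the existing
    // Source Monitor integration code, so its signature may differ.
    public interface ICodeMetricAdapter
    {
        void ProcessBuild(string buildUri);
    }

    public class SourceMonitorAdapter : ICodeMetricAdapter
    {
        public void ProcessBuild(string buildUri)
        {
            Console.WriteLine("Importing Source Monitor metrics for " + buildUri);
        }
    }

    public class NDependAdapter : ICodeMetricAdapter
    {
        public void ProcessBuild(string buildUri)
        {
            Console.WriteLine("Importing NDepend metrics for " + buildUri);
        }
    }

    public class CodeMetricsHost
    {
        // The parent warehouse adapter holds a list of code metric adapters...
        private readonly List<ICodeMetricAdapter> adapters = new List<ICodeMetricAdapter>
        {
            new SourceMonitorAdapter(),
            new NDependAdapter()
        };

        // ...and loops through the list, calling each adapter in turn.
        public void ProcessBuild(string buildUri)
        {
            foreach (ICodeMetricAdapter adapter in adapters)
            {
                adapter.ProcessBuild(buildUri);
            }
        }
    }

Adding a third tool is then just a matter of writing another class that implements the interface and adding it to the list.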

Below is the before and after class diagrams of the solution:

Before:

[Class diagram: TFS warehouse adapter before refactoring]

After:

[Class diagram: TFS warehouse adapter after refactoring]

11 Jul 2008

How I built a Team Foundation Server custom data warehouse adapter

As mentioned in a previous blog post, I would like to explain some of the steps I took to get my custom data warehouse adapter to work. I am going to start from the beginning so that I have an easy starting point. The code shown here is based upon my POC and written in C#.
  1. Create a new C# Class Library project in Visual Studio, then add references to at least the following assemblies: Microsoft.TeamFoundation.dll, Microsoft.TeamFoundation.Client.dll, Microsoft.TeamFoundation.Common.dll, Microsoft.TeamFoundation.Warehouse.dll and System.Web. As I need to connect to the build server I also added Microsoft.TeamFoundation.Build.Common.dll and Microsoft.TeamFoundation.Build.Client.dll.
  2. Specify that your class implements the IWarehouseAdapter interface.
  3. Implement IWarehouseAdapter.RequestStop, which simply sets a stop flag to true; the other methods can then test this flag periodically. A minimal skeleton of the class is sketched below.
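The skeleton below is just a sketch to show the overall shape; the DataStore parameter type follows the MSDN walkthrough mentioned in my earlier post, so check the exact signatures against the IWarehouseAdapter definition in Microsoft.TeamFoundation.Warehouse.dll.

    using Microsoft.TeamFoundation.Warehouse;

    public class MyWarehouseAdapter : IWarehouseAdapter
    {
        private DataStore _DataStore;
        private bool _StopRequest;

        public void RequestStop()
        {
            // Just record that a stop was requested; MakeSchemaChanges and
            // MakeDataChanges test this flag periodically and exit early.
            _StopRequest = true;
        }

        public void Initialize(DataStore ds)
        {
            // See step 4 below.
        }

        public SchemaChangesResult MakeSchemaChanges()
        {
            // See step 5 below.
            return SchemaChangesResult.NoChanges;
        }

        public DataChangesResult MakeDataChanges()
        {
            // See step 6 below.
            return DataChangesResult.NoChanges;
        }
    }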
  4. Implement IWarehouseAdapter.Initialize. This is where I store the objects needed to communicate with all of the systems that the adapter has to talk to, e.g.:
    string url;
    TeamFoundationServer tfs;
    _DataStore = ds;

    if (_DataStore == null)
    {
        throw new Exception("Null data store.");
    }

    url = Microsoft.TeamFoundation.Server.TeamFoundationApplication.TfsNameUrl;
    tfs = TeamFoundationServerFactory.GetServer(url);

    if (tfs == null)
    {
        throw new Exception("TF Server instance not obtained for TFS url: " + url);
    }

    _BuildServer = (IBuildServer)tfs.GetService(typeof(IBuildServer));

    if (_BuildServer == null)
    {
        throw new Exception("Build Server instance not obtained for TFS url: " + url);
    }

    _CommonStructureService = (ICommonStructureService)tfs.GetService(typeof(ICommonStructureService));

    if (_CommonStructureService == null)
    {
        throw new Exception("Common Structure Service instance not obtained for TFS url: " + url);
    }
  5. Implement IWarehouseAdapter.MakeSchemaChanges. This method is called to see if there are schema changes that need to be made to the data warehouse, and is the recommended place to register the changes you need to make to the warehouse, e.g.:
    SchemaChangesResult result = SchemaChangesResult.NoChanges;

    // Get the current data warehouse configuration
    WarehouseConfig warehouseConfig = _DataStore.GetWarehouseConfig();

    // As IWarehouseAdapter.MakeSchemaChanges can be called many times in one session,
    // I needed to check whether my fact had already been created
    Fact MyNewFact = warehouseConfig.GetFact("MyNewFact");
    if (MyNewFact == null)
    {
        // Create my new fact
        MyNewFact = new Fact();
        MyNewFact.Name = "MyNewFact";

        // Add a link to the Team Project dimension – I had to do this otherwise my save would not work
        MyNewFact.DimensionUses.Add(CreateDimensionUse("Team Project", "Team Project"));

        // Add a link to the Build dimension; when using the dimension I also needed to
        // find out what the dimension key attribute was, for when I save my data
        MyNewFact.DimensionUses.Add(CreateDimensionUse("Build", "Build"));
        MyNewFact.Fields.Add(CreateField("measure1", "int", 0, "Sum"));
        MyNewFact.Fields.Add(CreateField("measure2", "int", 0, "Sum"));

        // Set which perspective it would show under if the Enterprise edition of Analysis Services were in use
        MyNewFact.PerspectiveName = "Code Churn";
        MyNewFact.IncludeCountMeasure = true;

        if (!_StopRequest)
        {
            // Start a transaction so my changes can be rolled back if they fail
            _DataStore.BeginTransaction();
            try
            {
                // Add the fact to the data warehouse configuration and save the changes
                warehouseConfig.Facts.Add(MyNewFact);
                _DataStore.Add(warehouseConfig);
                _DataStore.CommitTransaction();
                result = SchemaChangesResult.ChangesComplete;
            }
            catch
            {
                _DataStore.RollbackTransaction();
                throw;
            }
        }
        else
        {
            result = SchemaChangesResult.StopRequested;
        }
    }
    return result;

    // The helper functions which create a new field and a dimension link

    private Field CreateField(string FieldName, string FieldType, short FieldLength, string FieldAggregationFunction)
    {
        Field newField = new Field();
        newField.Name = FieldName;
        newField.Type = FieldType;
        newField.Length = FieldLength;
        newField.AggregationFunction = FieldAggregationFunction;
        return newField;
    }

    private DimensionUse CreateDimensionUse(string DimensionName, string UseName)
    {
        DimensionUse dimensionUse = new DimensionUse();
        dimensionUse.DimensionName = DimensionName;
        dimensionUse.UseName = UseName;
        return dimensionUse;
    }
  6. Implement IWarehouseAdapter.MakeDataChanges, which is where all the transferring and transforming of the data takes place.
    IEnumerator projectEnum = _CommonStructureService.ListAllProjects().GetEnumerator();
    DataChangesResult result = DataChangesResult.NoChanges;
    IEnumerator buildEnum;
    IBuildDetail[] buildDetails;
    ProjectInfo currentTeamProject;
    DateTime buildStartSearch = LastMyNewFactProcessedDateTime;
    DateTime buildEndSearch = DateTime.Now;

    while (projectEnum.MoveNext())
    {
        // Check whether a stop has been requested; if so, report that a stop happened
        if (_StopRequest)
        {
            return DataChangesResult.StopRequested;
        }

        // Check the state of the project, as I am not interested in deleted projects
        currentTeamProject = projectEnum.Current as ProjectInfo;
        if (currentTeamProject.Status != ProjectState.Deleting)
        {
            // Get the list of builds and then filter out builds that I have already processed
            buildDetails = _BuildServer.QueryBuilds(currentTeamProject.Name);
            var filterBuildList = from bd in buildDetails
                                  where bd.StartTime >= buildStartSearch
                                  && bd.FinishTime < buildEndSearch
                                  select bd;

            // From my new list of builds, save my fact details
            buildEnum = filterBuildList.GetEnumerator();

            while (buildEnum.MoveNext())
            {
                if (_StopRequest)
                {
                    return DataChangesResult.StopRequested;
                }
                SaveMyNewFactEntry(buildEnum.Current as IBuildDetail, currentTeamProject);

                // Update the last processed date to the end of the period I have just covered
                LastMyNewFactProcessedDateTime = buildEndSearch;
                result = DataChangesResult.ChangesComplete;
            }
        }
    }
    return result;

    private void SaveMyNewFactEntry(IBuildDetail buildDetail, ProjectInfo currentProject)
    {
        Random rand = new Random();

        // Check that this build already exists in the Build Details fact table
        string buildURI = LinkingUtilities.DecodeUri(buildDetail.Uri.AbsoluteUri).ToolSpecificId;
        if (_DataStore.GetFactEntry("Build Details", buildURI) != null)
        {
            FactEntry newFactEntry = _DataStore.CreateFactEntry("MyNewFact");
            // I had to set a Tracking ID; not doing so caused an error. The Tracking ID is used
            // to find facts, so ideally use a method which is repeatable for the same fact entry each time.
            newFactEntry.TrackingId = Guid.NewGuid().ToString("D", CultureInfo.InvariantCulture);
            newFactEntry["Team Project"] = LinkingUtilities.DecodeUri(currentProject.Uri).ToolSpecificId;
            newFactEntry["Build"] = buildURI;
            newFactEntry["measure1"] = rand.Next(0, 100);
            newFactEntry["measure2"] = rand.Next(0, 100);

            // Again, create a transaction, save the entry and, if successful, commit the changes
            _DataStore.BeginTransaction();
            try
            {
                _DataStore.SaveFactEntry(newFactEntry, true);
                _DataStore.CommitTransaction();
            }
            catch
            {
                _DataStore.RollbackTransaction();
                throw;
            }
        }
    }
  7. I needed to store the last date processed, so I created a property which stores the value in the data store property bag.
    // The last date that was processed
    private DateTime LastMyNewFactProcessedDateTime
    {
        get
        {
            String lastMyNewFactProcessedDateTimeStr = _DataStore.GetProperty("Last MyNewFact Processed DateTime");
            DateTime lastMyNewFactProcessedDateTime = DateTime.MinValue;
            if (!String.IsNullOrEmpty(lastMyNewFactProcessedDateTimeStr))
            {
                lastMyNewFactProcessedDateTime = DateTime.Parse(lastMyNewFactProcessedDateTimeStr);
            }
            return lastMyNewFactProcessedDateTime;
        }

        set
        {
            _DataStore.BeginTransaction();
            try
            {
                _DataStore.SetProperty("Last MyNewFact Processed DateTime", value.ToString());
                _DataStore.CommitTransaction();
            }
            catch
            {
                _DataStore.RollbackTransaction();
                throw;
            }
        }
    }
  8. Build the adapter as a DLL.
  9. Copy the built DLL into the warehouse plugins folder on the application tier. In most cases this will be C:\Program Files\Microsoft Visual Studio 2008 Team Foundation Server\Web Services\Warehouse\bin\Plugins.
  10. Reset IIS.
  11. On the application tier, navigate to http://localhost:8080/Warehouse/v1.0/warehousecontroller.asmx?op=Run and click Invoke.
  12. On the application tier, navigate to http://localhost:8080/Warehouse/v1.0/warehousecontroller.asmx?op=GetWarehouseStatus and click Invoke. Continue doing this until the status is returned as Idle. Check the application event log to see if an exception has been thrown.
  13. Reset IIS.

If you wish to find out what your current warehouse configuration is, run the following query against your TFS warehouse:

SELECT CAST(wc.Setting AS XML) AS setting FROM dbo.[_WarehouseConfig] AS wc WHERE wc.ID = 'ConfigXML'

Building a Team Foundation Server custom data warehouse adapter

I have been investigating a way of adding more metrics about our builds into the Team Foundation Server 2008 warehouse. This would allow us to measure the quality of our code over time using metrics other than just those offered by Team Foundation Server 2008 out of the box.

So my goal was to create a new fact table and link it to the Build dimension. The reason for building a fact table was that I want to do some aggregation against the metrics; this means I need to treat them as measures, and measures only live in a fact table. I also wanted a separate fact table so that my adapter doesn’t impact the standard TFS adapters.
As my facts would be generated by builds, each fact would link to a build, which is why I used the Build dimension. I also found out later that I had to include the Team Project dimension as well.

I managed to find some useful links that helped me write my adapter:

http://msdn.microsoft.com/en-us/library/bb130342.aspx – this section covers several topics: how to implement an adapter, how the data warehouse is created, and an example of how to create an adapter. The example only shows how to extend existing dimensions, not how to create new objects.

http://tomfury.spaces.live.com/blog/ – this guy was trying to do the same thing as me and created a series of blog entries explaining what he was doing, the issues he was facing, and some code examples. He also posted the following questions on the forums, which were helpful:

http://forums.microsoft.com/MSDN/showpost.aspx?postid=2464644&siteid=1
http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=1644099&SiteID=1
http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=2518423&SiteID=1

There have been some changes, as I understand it, between 2005 and 2008. As I no longer code against TFS 2005 I can’t make any real comment about these. However, I found these links helpful for coding against TFS 2008:

http://ozgrant.com/2006/06/18/how-to-list-tfs-team-projects-using-the-api/ – this helped me gain information about the projects that are on the TFS server.

http://notsosmartbuilder.blogspot.com/2007/09/team-build-2008-api.html – this helped me to query the TFS server about the builds it knows about. As far as I can see there have been a lot of changes between 2005 and 2008: in the 2008 API some namespaces have been made obsolete (Microsoft.TeamFoundation.Build.Proxy, along with a few classes in Microsoft.TeamFoundation.Build.Common).

To see how I wrote my custom TFS data warehouse adapter please click here.

10 Jun 2008

Could HTTP Soap / T-SQL endpoint be replaced by Reporting Services XML Reports?

I have been working on a POC using SQL Server 2008 and Google Maps. The POC was to use geospatial technology to visualise our geospatial data easily. I will talk about this in more detail in another blog post, as here I would like to talk about a problem I faced while creating the POC.

The problem I faced was: how could I get my geospatial data, which was stored within SQL Server 2008 using the new geography data type, into “static” HTML?

So I thought that a SQL Server HTTP endpoint would be a great way to solve this problem, as all I was after was an XML document streamed over HTTP. There was no business logic needed, so any .NET code would just be another layer acting as a go-between, and I didn’t want to waste time on code which wouldn’t have added any value.

So I created a stored procedure that produced a usable XML document using FOR XML PATH. Then I started to look at how to create an HTTP endpoint for SOAP. After working through this I was presented with this message from SQL Server 2008: "Creating and altering SOAP endpoints will be removed in a future version of SQL Server. Avoid using this feature in new development work, and plan to modify applications that currently use it."

It was nice to see that Microsoft are giving people plenty of warning about breaking changes in the next version of SQL Server. It’s a surprise that a feature is being dropped so quickly – second only to Notification Services in the time I have known the product – but I can understand the decision, as a lot of questions about how, why and whether it should be used were asked by some of our SQL Server developers. So I took notice of Microsoft’s warning and tried to see if there was another way to get my data into an XML document and streamed over HTTP without using .NET code and web services.

All I was doing with this data was reading it for display via another component, and the data was in a structured format. Then I thought about Reporting Services, as you can write a report and get the report server to render it to XML. So I quickly wrote a report to get the output XML into the format that I wanted, and in my static HTML I changed my HTTP request to point to my report server with the option to render to XML.

This approach did work and was great: I managed to produce a page that could request the XML data and render it without any need to write .NET code, and the static HTML page would always remain up to date. Another bonus was that I had a report ready to show the data in table form without having to rewrite or duplicate SQL code. I was also able to extend this to accept parameters, so I could order the data based upon user input from an HTML control.
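For anyone who wants to try the same request outside a browser, a minimal sketch in .NET is shown below; the server name, report path and parameter are placeholders, and rs:Format=XML is the standard Reporting Services URL access option for XML rendering.

    using System;
    using System.Net;

    class ReportAsXml
    {
        static void Main()
        {
            // Placeholder server, report path and report parameter; substitute your own.
            string url = "http://myreportserver/ReportServer?/GeoSpatialPoc/LocationsAsXml" +
                         "&rs:Command=Render&rs:Format=XML" +
                         "&Area=London";

            using (WebClient client = new WebClient())
            {
                // Reporting Services does not allow anonymous access, so pass the
                // current Windows credentials with the request.
                client.UseDefaultCredentials = true;
                Console.WriteLine(client.DownloadString(url));
            }
        }
    }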

Now this approach is not always going to work in every situation. One reason is that Reporting Services security will not allow anonymous access, so when using Windows security the report server would need to be on the same server. A way to overcome this would be to create a forms authentication method on the report server and get your site to create the forms cookie automatically.

Another potential problem is the way that Reporting Services sends the XML data over HTTP: the XML is actually sent as a file attachment on the HTTP response. Some applications, like sidebar gadgets, might not be allowed to, or simply can’t, handle HTTP attachments.

19 Feb 2008

How to E-mail the Sprint Burndown Chart to yourself or your team.

I have been working on Scrum for Team System, focusing on the reporting functionality of the tool. One of the things I looked at was getting the reports delivered to me by e-mail. I went looking to see if there was a way to do this via the Team Explorer interface, and quickly found that there wasn’t. However, because TFS uses Reporting Services, I knew that I could use the Reporting Services subscription functionality.
Firstly, if it is not already configured, the Reporting Services server will need to be set up to point to an SMTP server. This can be done by using the Reporting Services Configuration tool and updating the e-mail settings within it.
You can find this tool on your TFS server under the following: Start Menu > All Programs > Microsoft SQL Server 2005 > Configuration Tools
[Screenshot: Reporting Services Configuration tool – e-mail settings]
To create the subscription you need to use Report Manager (to get to it you can right-click on the Reports folder within Team Explorer):
[Screenshot: Team Explorer 2008 – Show Report Site]
When Report Manager has loaded, the root folder will contain folders named after the Team Projects. Click on the folder that contains the required reports. After the list of reports has loaded, click on the Sprint Burndown report to view it. After the report has rendered, select the Subscriptions tab and, once the page has loaded, click New Subscription. Then fill in the necessary details about how the report should be delivered and which report parameters it should use when it runs. Finally, for the parameters I would recommend selecting the option to use the report defaults.

How to tell when the Team Foundation Server data warehouse was last updated.

While working on the Scrum for Team System reporting I came across a recurring issue: not knowing when our data warehouse was last updated or when the next update was due to happen.

After some investigation, including some information from a link on one of my colleague’s blog posts, I found the answer. In the TFS data warehouse there is a table called _WarehouseConfig. This contains some useful information, such as the time the data warehouse was last processed, the processing interval, and the end time of each step of the process.
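If you just want to eyeball those values, a minimal sketch along these lines will dump the whole property bag from the relational warehouse; the server name in the connection string is a placeholder, and the individual setting IDs vary between installations, so nothing is filtered here.

    using System;
    using System.Data.SqlClient;

    class WarehouseConfigDump
    {
        static void Main()
        {
            // Placeholder server name; TfsWarehouse is the default warehouse database name.
            string connectionString =
                "Data Source=MyTfsDataTier;Initial Catalog=TfsWarehouse;Integrated Security=SSPI";

            using (SqlConnection connection = new SqlConnection(connectionString))
            using (SqlCommand command = new SqlCommand(
                "SELECT wc.ID, wc.Setting FROM dbo.[_WarehouseConfig] AS wc", connection))
            {
                connection.Open();
                using (SqlDataReader reader = command.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        // Each row is one setting, e.g. the last processed time
                        // or the processing interval.
                        Console.WriteLine("{0} = {1}", reader["ID"], reader["Setting"]);
                    }
                }
            }
        }
    }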

With this information I updated the version report for Scrum for Team System to include the following: the processing interval, the next time processing is due, the last time the cube was processed, and the last time the data in the data warehouse was updated.

I have attached a copy of the new report, as it is not exclusive to Scrum for Team System. Note that this report was written for TFS 2008.