Posts

Showing posts from 2014

Integrating tsqlt with Team Foundation Build Services continuous build

As I've mentioned in a previous blog post, I have been integrating a SQL Server unit test framework, tsqlt, into my project life-cycle (click here to read more). Within this post I will share how I got my tsqlt unit tests executing within the continuous integration solution, with the results influencing the outcome of the build. The continuous integration platform I am using is Team Foundation Build Services 2012 (TFBS). However, my approach will work with any continuous integration platform which handles MSBuild and MSTest, such as TeamCity. Build Integration As I had created my database unit tests within SQL Server Data Tools (SSDT), integrating them into continuous integration was a straightforward process: I extended the build definition to include the unit test project in the list of solutions/projects which have to be built. As I had my own solution for all the database projects and other related projects, which was listed within the build definition, my unit tests…
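Under the hood, an SSDT database unit test of this kind simply executes the corresponding tsqlt procedure, so a sketch of the T-SQL a build's test step ends up running is (the test class name is illustrative):

```sql
-- Run every tsqlt test in the database; any failure raises an error,
-- which MSTest reports as a failed test and the build server turns
-- into a failed build.
EXEC tSQLt.RunAll;

-- Or run just one test class (illustrative name):
EXEC tSQLt.Run 'OrderTests';
```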

Integrating tsqlt with SQL Server Data Tools

As mentioned in a previous post, I've recently been working with a SQL Server unit test framework: tsqlt. I would like to share how I managed to integrate the framework and its unit tests into my project life-cycle. Unit tests should be treated the same as application code, meaning they should be contained within a project under source control. Being under source control allows the tests to be maintained and changes tracked when multiple people are helping to write tests. Also, having the unit tests in a project means there is a deployment mechanism out-of-the-box. The tsqlt framework has a few requirements, which are as follows: CLR is enabled on the SQL Server; the framework and the unit tests are contained within the same database as the objects being tested; and the database instance is trustworthy. As unit tests are required to be contained within the same database as the objects being tested, and also due to the way SQL Server Data Tools (SSDT) project validation works, th…
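The instance-level requirements above come down to a couple of lines of T-SQL, run as a sysadmin (the database name is illustrative):

```sql
-- Enable CLR on the instance; the tsqlt assembly needs it.
EXEC sp_configure 'clr enabled', 1;
RECONFIGURE;

-- Mark the database trustworthy so the tsqlt assembly is allowed to run.
ALTER DATABASE MyDatabase SET TRUSTWORTHY ON;
```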

Automated Unit Testing SQL Server Code

I have never been a fan of automated unit testing of SQL code, mainly because the previous frameworks/tools I came across all seemed to miss the point of SQL code, which is either retrieving or manipulating data. Some of the frameworks would only test SQL scalar functions, which limited code coverage. Others would test stored procedures, but only the output parameter rather than the data being returned, which limited the usefulness of the test. The methods used to determine whether a test passed or failed were also limited: some would only check the return value, that the correct number of rows was returned, or that a single row, or a single value within a row and column, appeared in the dataset. This meant you could write code that satisfied the test yet was still functionally incorrect, as the rest of the data, which was never checked, could be inaccurate. Another problem that some of the previous frameworks forced upon you to solve was how to make the tests repeatable. As data has state, which is…
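By contrast, tsqlt lets a test assert on the whole result set rather than a single value. A minimal sketch of such a test (the test class, table, and view names are all illustrative):

```sql
-- A tsqlt test: fake the source table, insert known rows, run the code
-- under test, then compare the entire expected and actual result sets.
CREATE PROCEDURE OrderTests.[test totals are summed per customer]
AS
BEGIN
    EXEC tSQLt.FakeTable 'dbo.Orders';
    INSERT INTO dbo.Orders (CustomerId, Amount) VALUES (1, 10), (1, 15);

    SELECT CustomerId, Total
    INTO #Actual
    FROM dbo.CustomerTotals;   -- the view under test (illustrative)

    CREATE TABLE #Expected (CustomerId INT, Total INT);
    INSERT INTO #Expected VALUES (1, 25);

    -- Fails the test if the two tables differ in any row or column.
    EXEC tSQLt.AssertEqualsTable '#Expected', '#Actual';
END;
```

Because tSQLt.FakeTable replaces the real table for the duration of the test, the test is also repeatable: it never depends on whatever data happens to be in the database.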

Configuring database files within Microsoft SQL Server Development Tools

Over the years SQL Server database development has evolved from non-structured tools (Query Analyser/Management Studio) to fully structured development tools (Visual Studio Database Professionals/SQL Server Data Tools). However, many important aspects of database development haven't changed over the years. One of these, which I would like to cover within this blog post, is database file configuration, especially within the Microsoft development tools: SSDT and VSDB-Pro. Getting your database file size and growth settings wrong can hurt your database performance. On one project we had our first test release of our data warehouse into production and ran our ETL process. At the time the databases had to take on the default file settings of the model database. The process completed within three hours, and our database files had all grown in excess of 40GB. We did another test release, after tearing down the environment, with database file settings worked into the deployment…
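The fix amounts to pre-sizing the files and giving them sensible fixed growth increments rather than inheriting model's tiny defaults. A sketch of the T-SQL (database, file names, and sizes are illustrative; base the numbers on your expected load):

```sql
-- Pre-size the data and log files and use fixed growth increments,
-- so the ETL run is not interrupted by repeated small auto-grow events.
ALTER DATABASE MyDataWarehouse
    MODIFY FILE (NAME = MyDataWarehouse_Data, SIZE = 50GB, FILEGROWTH = 1GB);
ALTER DATABASE MyDataWarehouse
    MODIFY FILE (NAME = MyDataWarehouse_Log, SIZE = 10GB, FILEGROWTH = 1GB);
```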

Waking up Reporting Services with scripting

I have recently deployed a new reporting solution using SQL Server 2008 Reporting Services. I wanted a way to ensure that the first user, after Reporting Services had spun down its application pools, doesn't have to wait for them to be re-initialised. This was so that I could avoid complaints about Reporting Services performance. I was looking for a simple solution: either stop Reporting Services clearing down the application pools, or find a way of speeding up the first request. I decided on an approach to speed up the first request made by a user, because I didn't want to change application settings in case other problems occurred. After searching for a method of speeding up Reporting Services application pool start-up, I came across a useful PowerShell script. The script, which was for PowerShell V2, made use of the .Net class System.Net.WebRequest to make a request to the report manager. I made a copy of this script and applied some…
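A minimal sketch of that kind of warm-up script, in the same PowerShell V2 style (the Report Manager URL is illustrative):

```powershell
# Issue a throw-away request to Report Manager on a schedule, so the
# Reporting Services application pool is spun up before a real user arrives.
$url = "http://myserver/Reports"              # illustrative URL
$request = [System.Net.WebRequest]::Create($url)
$request.UseDefaultCredentials = $true        # authenticate as the running account
$response = $request.GetResponse()            # this request triggers the warm-up
$response.Close()
```

Scheduled via SQL Agent or Task Scheduler, a request like this absorbs the slow first hit instead of a user.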