Posts

Showing posts from 2006

MSBuild using SQL Script parameters

I have been working with MSBuild and the Microsoft.Sdc.Tasks library to deploy my current project's databases. Some of the SQL scripts that are executed within MSBuild need parameters to be set before they can create the database objects, which the Microsoft.Sdc.Tasks.Sql.Execute task supports. To use parameters with MSBuild and Sql.Execute, first create an item group (<ItemGroup>), then create an element with any name, as this becomes the name of the item that can be referred to in MSBuild. The new element must have an attribute called "Include" set, and then two child elements called "name" and "value". Here is an example:

<ItemGroup>
  <schemaName Include="true">
    <name>@pTableName</name>
    <value>'sys.Objects'</value>
  </schemaName>
  <schemaTable Include="true">
    <name>@pTableId</name>
    <value>342</value>
  </schemaTable>
</ItemGroup>
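For reference, a parameterised script consuming those values might look something like this. This is only a sketch: the parameter names come from the example above, but the script body is hypothetical.

```sql
-- Hypothetical script body: @pTableName and @pTableId are supplied
-- by the MSBuild item group when Sql.Execute runs the script.
IF NOT EXISTS (SELECT 1 FROM sys.objects WHERE object_id = @pTableId)
BEGIN
    PRINT 'Creating object ' + @pTableName;
    -- CREATE ... statements would go here
END
```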

To Collate or not to Collate

Collation plays an important role within the database, as it sets out how SQL Server has to manage string data in storage and queries, yet I don't script out the collation when scripting out my databases. So why don't I script out the collation if it is so important? It's because if the collation is specified but not used properly within the database it can cause some of the most common problems, like "Cannot resolve collation conflict for EQUAL TO operation" or "Cannot resolve collation conflict for UNION operation". These errors are caused when string data with different collations is compared or combined into the same result set. This is nothing new, so why talk about collation now? Well, it's to do with the release of Visual Studio 2005 Team Edition for Database Professionals (VSTDB), something I think needs to be taken into consideration if it is being used to manage the database side of the project. A cause for these types of problem is when a table is created o
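As a sketch of the kind of clash these errors describe (the table, column and collation names here are illustrative):

```sql
-- Two columns created with different collations...
CREATE TABLE #a (Name VARCHAR(50) COLLATE Latin1_General_CI_AS);
CREATE TABLE #b (Name VARCHAR(50) COLLATE SQL_Latin1_General_CP1_CI_AS);

-- ...cause a "Cannot resolve collation conflict for EQUAL TO operation"
-- error when compared directly:
SELECT * FROM #a JOIN #b ON #a.Name = #b.Name;

-- One fix is to state the collation explicitly in the comparison:
SELECT * FROM #a JOIN #b
    ON #a.Name = #b.Name COLLATE Latin1_General_CI_AS;
```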

Deploying Reporting Services Reports With MSBuild

I am currently responsible for deploying our project into the test and production environments. The project has a few reports which need to be installed on a few report servers. The deployment tool that I am using for the rest of the project is MSBuild, so I tried to see if there was a way of getting MSBuild to deploy the reports. The only method that I found was to write an RS script file which deployed the reports, and to call the Exec task to run the RS script. So I started out with RS script, though I was not overjoyed about writing VB.Net code as I am a C# coder. Then I remembered that MSBuild is customisable, as it allows custom tasks to be written. So I stopped writing my VB.Net code, went back to C# and created a few tasks that are commonly needed to deploy: AddReportUser, CreateReportFolder, CreateConnectionSources, DeployReports and SetReportsDataSource. The task code was easy to write, although I did have to change some of the auto-generated code. I changed the reference.cs

Reporting Services Express Edition using hosts file redirect to connect to local database.

I had a few reports for an application that needed to be deployed to a number of different report servers. The report servers were the Express edition of Reporting Services 2005, with the application database on the same server, which helped with the connection limitation of Reporting Services 2005 Express. As the report servers were joined to a domain, their computer names had to be unique, and the only way the application was going to get around this problem was to use a hosts file redirect. To help with deploying the reports I was going to try to use the same hosts idea, so I added the server name to the localhost entry and tried to run the reports. Unfortunately this would not work, because it looks like Reporting Services 2005 Express checks the name of the computer, which it gets from the registry path HKEY_LOCAL_MACHINE\SYSTEM\ControlSet001\Control\ComputerName\ActiveComputerName, against the data source within the connection, and if they don't match it raises an error.
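The hosts file redirect itself is just an extra entry mapping the expected server name onto the local machine (the server name here is hypothetical):

```
# C:\Windows\System32\drivers\etc\hosts
127.0.0.1    localhost    APPSERVER01
```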

Report Viewer Customizing Printing for Local Report

We had a requirement to allow a front page, which was another local report, to be printed before the report within the report viewer was printed. As both reports were local reports it was easy to create a print button which allowed us to meet the requirement; here is the code that we used:

if (this.printFrontingSheetCheckbox.Checked)
{
    LocalReport frontsheet = front.reportViewer1.LocalReport;
    Export(frontsheet, 8.27f, 11.69f, 0f, 0f, 0f, 0f);
    m_currentPageIndex = 0;
    Print(false);

    foreach (Stream s in m_streams)
    {
        s.Close();
    }
    front = null;
}

LocalReport report = reportViewer.LocalReport;
Export(report, 11.69f, 8.27f, 0f, 0f, 0f, 0f);
m_currentPageIndex = 0;

Print(true);
foreach (Stream s in m_streams)
{
    s.Close();
}

private void Print(bool landscape)
{
    if (m_streams =

Dynamic SQL Server ordering without using dynamic SQL.

For my current project I have a stored procedure that needs to sort data by one of three columns, depending upon the option chosen by the user from the application. This is the only dynamic part of the stored procedure, which also happens to be very complicated, so I didn't wish to use dynamic SQL. I know that within SQL Server 2000 you can use a CASE against a variable to select which column to order the data by, but the caveat is that all the columns have to be the same data type. My problem was that not all my columns are the same data type. I managed to overcome this by using the new function within SQL Server 2005 called Row_Number(), which assigns a row number to each row based upon the ORDER BY used within the OVER clause; as this always returns the same data type, it can be used within the CASE statement for the ordering. Below is an example of the code:

DECLARE @vOrderby AS INT
SET @vOrderby = 4

SELECT TABLE_CATALOG, TABLE_SCHEMA, TABLE_NA
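A fuller sketch of the technique, using INFORMATION_SCHEMA.TABLES as sample data (the option numbers and columns are illustrative):

```sql
DECLARE @vOrderby AS INT
SET @vOrderby = 2

SELECT TABLE_CATALOG, TABLE_SCHEMA, TABLE_NAME, TABLE_TYPE
FROM INFORMATION_SCHEMA.TABLES
ORDER BY CASE @vOrderby
             -- ROW_NUMBER() always returns the same data type (BIGINT),
             -- so columns of different types can share one CASE expression.
             WHEN 1 THEN ROW_NUMBER() OVER (ORDER BY TABLE_NAME)
             WHEN 2 THEN ROW_NUMBER() OVER (ORDER BY TABLE_TYPE)
             ELSE        ROW_NUMBER() OVER (ORDER BY TABLE_SCHEMA)
         END
```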

Reporting Services Report Viewer Control & Page Sizes

I have found that the Reporting Services report viewer control might not use the correct paper size to render the report. The reason for this can be found at the following link: click here. A quick summary of the solution: if you wish the report viewer control to render an A4 page size instead of Letter, then within the report define the page size of A4 in inches.
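In the RDL this amounts to specifying the A4 dimensions in inches rather than centimetres, along these lines:

```xml
<!-- A4 (210mm x 297mm) expressed in inches -->
<PageWidth>8.27in</PageWidth>
<PageHeight>11.69in</PageHeight>
```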

Avery Labels Printing via RDL / Reporting Services

I have just managed to solve the problem of how to give an application the ability to print Avery labels with data from our database. I only had to support two label formats (L7161 and L7163) and the data requirement was static as well. So I looked at RDL / Reporting Services and the Report Viewer control. The way I wrote the report is as follows: Set the page size to A4 and all the margins to 0cm. Made sure that the report was not using a header or footer. Placed one list control per column of labels on a sheet. Changed the size of the list control to be the same as a single label. Then set an expression on the list visibility based on the result of the row number modulus the number of columns (making sure each column looked for a unique result). Then added the textboxes for the data, making sure there was a gap between the top and bottom of the list control. Also made sure that the text boxes didn't push the list box size out. I did have trouble trying to line the labels up 10
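For a two-column sheet, the visibility expressions might look something like the following sketch; the exact expressions depend on the label layout and are an assumption, not the report's actual RDL.

```
' Hidden property of the left-hand list: hide even-numbered rows
=IIF((RowNumber(Nothing) Mod 2) = 0, True, False)

' Hidden property of the right-hand list: hide odd-numbered rows
=IIF((RowNumber(Nothing) Mod 2) = 1, True, False)
```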

SQL Server 2005 Merge Replication Subscriber unexpectedly deleting data during synchronisation.

Within my current project I have been using SQL Server 2005 merge replication. While I have been testing, some issues have been raised by our testers, and I would like to write about one of these in particular. Data which was entered at the subscriber was being removed while synchronising with the publisher. This can happen when the data entered isn't part of the partition for a given subscriber. However, on this occasion we expected the data to remain in place. After some investigation by me and my colleague, James Rowland-Jones, we found the issue. We noted that the affected tables were organised in parent-child relationships at a table / schema level, but they were joined child-parent in the replication topology. They were joined via a join filter with a partition filter on the child record. So Customers 1 -> N Orders, but partitioned on Orders with a join filter back to Customers. The topology was created in this way to help partition the data sent to the indiv

Installing SQL Server 2005 Express with Remote Connections.

I have managed to set up a SQL Server 2005 Express instance, on a Windows XP machine, with remote connections enabled. The actual installation of the SQL Server 2005 Express instance was straightforward, but enabling the remote connections was problematic, and I found that I was not the only person with this problem. In the end I found these steps worked for me:

1. Open the SQL Server Surface Area Configuration tool.
2. Click Surface Area Configuration for Services and Connections.
3. Click the Remote Connections entry in the tree and select Local and remote connections (TCP/IP). Click OK.
4. Open the SQL Server Configuration Manager.
5. Select SQL Server 2005 Network Configuration / Protocols for the instance.
6. Right-click TCP/IP and select Properties.
7. Click the IP Addresses tab and change the Enabled flag on the IP addresses to Yes.
8. In the IPAll section, copy the port value of the TCP Dynamic Ports to the TCP Port, then clear the TCP Dynamic Ports value. Click OK.
9. Open Control Panel / Firewall. Click Exception
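On Windows XP SP2 the firewall exceptions can also be added from the command line, something like the following; this assumes the fixed TCP port chosen above was the default 1433, and the rule names are arbitrary.

```
netsh firewall add portopening TCP 1433 "SQL Server 2005 Express"
netsh firewall add portopening UDP 1434 "SQL Server Browser"
```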

SSIS Creating Synchronous Transform Component

Do you ever need to write a custom component that transforms input data by disregarding it and replacing it? That's what I needed to accomplish with my component. It takes data from the input buffer and performs a lookup via an external API. If there is a match, it sends the result data down a "match" output, otherwise it sends the input data down an "unmatch" output. The key here is that the "shape" of the output was changing; in other words, the metadata of the output was very different from that of the input. I also wanted my component to be synchronous (by synchronous, I mean that the component takes a row, processes it and then immediately sends it down an output) and not asynchronous (meaning that it reads all the data and then processes the data in an internal buffer). Now from what I was reading and seeing this was what I thought SSIS meant by synchronous and asynchronous components, but I was wrong! An asynchronous component can either

SSIS File handle leak in For Each Loop

I am currently working on a project which needs to load over 1000 XML files. The files are stored across 10 subfolders. I am using a foreach loop with a file enumerator, which is configured at the top of the folder structure and traverses the subfolders. This loops through the files, loads the data and then moves each file to another folder. The package executes fine for a few hundred files but then hangs; this happens after a different number of processed files each time the package is run. While trying to resolve the problem we ran performance counters and noticed that the number of open handles increased significantly just about the time DTExec looked like it had hung, and DTExec also then started taking a lot of the CPU processing time. Update: I put this on the Microsoft forums and got an interesting answer back (see here); there seems to be a memory leak with the foreach loop which should be fixed in SP1.

SSIS Connection Object and Expressions

I have been writing a custom source adapter that uses a file connection within the connection manager. If I hard-code a specific file then the component works. However, if I use a file connection that has an expression defined which updates the connection (for example, when you have a for-each loop looping over a set of files), the file connection doesn't seem to re-evaluate the expression each time you access it via the code.

SSIS Variable Deadlock

I have noticed a problem with the script task within SSIS. The problem occurs when a script task encounters an error and the SSIS OnError event handler also has a script task that uses a variable which is locked by the script task that raised the error. The problem seems to be that the SSIS OnError event fires and runs before the script task that raised the error has finished and released its lock on the variable. I even tried to overcome this problem by using a try/catch block to release the variable locks, but this also failed to resolve the problem.

SSIS Changes Database Settings

I have an SSIS package with a simple data flow task, which takes data from a flat file and loads it into an OLE-DB destination (configured to use Native OLE-DB\SQL Native Client). As I ran the package via the command line I noticed that SSIS made changes to the database options. One of the most important options being changed was the database recovery model, which was modified to Bulk-Logged. This change of recovery model could cause problems for the back-up and recovery procedure if you are using incremental transaction log back-ups, as Bulk-Logged will not allow these to run. I can understand why SSIS makes this change, because Bulk-Logged does not write any data to the transaction log for bulk operations. For this reason it does not allow incremental log back-ups, and it is always recommended that you do a back-up of the database when you change back to Simple or Full mode.
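It is worth checking, and if need be resetting, the recovery model after such a load. A sketch (the database name is hypothetical):

```sql
-- Check the current recovery model
SELECT name, recovery_model_desc
FROM sys.databases
WHERE name = 'MyDatabase';

-- Put it back to Full so transaction log back-ups keep working
ALTER DATABASE MyDatabase SET RECOVERY FULL;
```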

SSIS XML Task / Source Adapter

I have been working with SSIS using the XML Task and XML Source Adapter. I was getting an error saying that an element was not defined. When I checked the XSD, which was using import / include schemas, I found the element was defined, but not in the main XSD schema the adapter was pointing to; it was in another schema which was being imported. The real problem was that the schema where the element was defined was invalid for the XML Source Adapter, and as soon as I corrected the imported schema the error was resolved. The problem with the imported XSD schema was that there was a DTD defined at the beginning of the file, and the XML Source Adapter does not support a mixture of DTD and XSD.

SSIS Execute SQL Task With VARCHAR(MAX)

While I was using the SSIS Execute SQL Task to return a single-row result I got the following error: [Execute SQL Task] Error: An error occurred while assigning a value to variable "RestoreCmd": "The type of the value being assigned to variable "User::RestoreCmd" differs from the current variable type. Variables may not change type during execution. Variable types are strict, except for variables of type Object!." This was happening because I was trying to assign a varchar(max) result column to an SSIS string variable. There are two ways to solve this problem: one is to change the SSIS variable type to Object and convert it to a string each time it is used; the second is to cast the result column to the maximum sized varchar.
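The second fix amounts to a cast on the result column. A sketch, reusing the variable name from the error above (the RESTORE command text is illustrative):

```sql
DECLARE @RestoreCmd VARCHAR(MAX);
SET @RestoreCmd = 'RESTORE DATABASE ...';

-- Cast down to a sized varchar so the Execute SQL Task can map the
-- single-row result to a String variable.
SELECT CAST(@RestoreCmd AS VARCHAR(8000)) AS RestoreCmd;
```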

SSIS 2005 and Database Snapshots

I have been looking into the feasibility of using database snapshots as a method of rolling back an SSIS package that fails, and this is what I found. Creating a database snapshot was not a real problem, but there is one issue that you need to be aware of: for each data file within the database that the snapshot is based on, a sparse file has to be created. I have created a little SQL script that automates the creation of the database snapshot. Rolling back a database to its initial state after a package has errored is a problem. The method to revert a database back to its initial state with database snapshots is a restore operation. The restore is a full restore, which of course means there can't be any connections while the restore is happening, and this can't be done on-line because it might have to take out the primary data file. This is a problem because the SSIS connection manager will create connections to the database when the package starts and doesn
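The snapshot and revert steps look roughly like this; the database name, logical file name and path are illustrative, and one sparse file entry is needed per data file in the source database.

```sql
-- Create the snapshot: one sparse file per data file
CREATE DATABASE MyDatabase_Snapshot ON
    (NAME = MyDatabase_Data,
     FILENAME = 'C:\Snapshots\MyDatabase_Data.ss')
AS SNAPSHOT OF MyDatabase;

-- Revert: a full restore, which requires exclusive access
RESTORE DATABASE MyDatabase
    FROM DATABASE_SNAPSHOT = 'MyDatabase_Snapshot';
```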