Database Edition Custom Rule

One of the coolest features of Visual Studio Database Edition is the ability to run rules against a database project, much like FxCop against a C# project. And the coolest part about this feature is that it is really simple to write a custom rule.

Rules are important. I’ve seen projects with no naming rule for database objects, so you end up with columns like ablr_xxxx or xxxxx_xxx_Authorname, and the whole database loses consistency over time. Thanks to Database Edition I can create a custom rule that forces all developers to use, for example, a standard naming scheme for columns.

Creating a rule is really simple: first of all create a strongly signed C# project, then add a class like this one.

using System;
using System.Collections.Generic;
using System.Text.RegularExpressions;
using Microsoft.Data.Schema.Extensibility;
using Microsoft.Data.Schema.SchemaModel;
using Microsoft.Data.Schema.SchemaModel.Abstract;
using Microsoft.Data.Schema.ScriptDom.Sql;
using Microsoft.Data.Schema.Sql.SchemaModel.SqlServer;
using Microsoft.Data.Schema.Sql.SqlDsp;
using Microsoft.Data.Schema.StaticCodeAnalysis;

internal class MyStylePrefixedTable : Rule

To give Visual Studio full details about your new rule, simply use the right constructor.

public MyStylePrefixedTable()
    : base(
        "All columns must be in the form xxxx_xxxxxxxx",
        "All columns must be in the form xxxx_xxxxxxxx, where the part before the _ is the table code and the part after is a descriptive name",
        // ... remaining arguments (rule namespace, unique code, help location, etc.) omitted

As you can verify, it is really simple to define the details of the rule: just give the namespace where the rule resides, then a unique code, name and description, and some other information like the help location.

A rule is a simple class that inherits from Microsoft.Data.Schema.StaticCodeAnalysis.Rule, then you simply need to specify what kind of object you want to check.

public override bool IsSupported(IModelElement element)
{
    return element is IColumn;
}

This specific rule has the purpose of enforcing a specific naming convention on all columns of the database. Each column name must contain four letters (the table code), then an underscore, then the descriptive name of the column. If we have the table Customer, the id column will be cust_id. This naming scheme may not be the best one, but I want it enforced throughout the database, because I really dislike each table having its own naming scheme. All the checking is done in the Analyze method.

public override IList<Problem> Analyze(IModelElement modelElement, RuleSetting ruleSetting, RuleExecutionContext context)
{
    List<Problem> problems = new List<Problem>();
    if (modelElement.Name == null)
        return problems;

    IList<string> nameParts = modelElement.Name.Parts;

    // A column name has three parts: schema, table, column.
    if (nameParts.Count == 3)
    {
        // Anchored pattern: the whole column name must be pref_name,
        // not merely contain such a substring.
        if (!Regex.Match(nameParts[2], @"^\w{4}_\w+$").Success)
        {
            string message = string.Format(
                "Column {0} of table {1}.{2} must have the form pref_columnname.",
                nameParts[2], nameParts[0], nameParts[1]);

            Problem p = new Problem(this, message, modelElement);
            p.Severity = ProblemSeverity.Error;
            problems.Add(p);
        }
    }
    return problems;
}

This code is really simple. The important part is that each object has a name subdivided into parts; a column has three parts: schema, table and column name. So I take the last part (the column name) and validate it against a regular expression. If the validation fails, I simply create a Problem object. The Analyze method must in fact return a list of Problem objects, each one describing a single, distinct problem of the object being analyzed.
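As a side note, the naming convention itself is easy to try in isolation. Below is a minimal sketch using plain .NET (not the Data Dude API); it uses an anchored variant of the pattern (`^\w{4}_\w+$`), which is deliberately stricter than a bare substring match so that a name like customer_id cannot slip through.

```csharp
// Standalone sketch of the column naming check: exactly four word characters
// (the table code), an underscore, then a non-empty descriptive name.
using System;
using System.Text.RegularExpressions;

static bool IsValidColumnName(string name) =>
    Regex.IsMatch(name, @"^\w{4}_\w+$");

Console.WriteLine(IsValidColumnName("cust_id"));     // True: 4-letter code + name
Console.WriteLine(IsValidColumnName("customer_id")); // False: code is not 4 letters
Console.WriteLine(IsValidColumnName("cust_"));       // False: descriptive name missing
```

Anchoring with `^` and `$` is the design choice that matters here: without it, any four word characters followed by an underscore anywhere inside the name would satisfy the match.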

Problem objects have several properties, but the most important one is Severity, which specifies whether the problem is a Warning or an Error. Since I want my convention enforced for every table, I set the Severity to Error.

Once you have this class ready, you need to create an XML file that contains data about the rule.

<?xml version="1.0" encoding="utf-8" ?>
<extensions xmlns="urn:Microsoft.Data.Schema.Extensions"
            xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
            xsi:schemaLocation="urn:Microsoft.Data.Schema.Extensions Microsoft.Data.Schema.Extensions.xsd">

    <extension type="SqlRule.MyStylePrefixedTable"
               assembly="SqlRule, Version=, Culture=neutral, PublicKeyToken=43c7c037e2f19384" />

</extensions>

You must sign your assembly and specify its fully qualified name in the XML; then you need to copy this XML file, along with the assembly that contains the rule, into the directory Program Files\Microsoft Visual Studio 9.0\VSTSDB\Extensions. Now you can reopen Visual Studio and verify that the rule gets loaded correctly.


To verify the rule I’ve simply created one simple table with one valid column and one invalid column.


If you build the project you will get.


As you can see my custom rule gets executed, and the project will not compile until all columns are named correctly.

Code can be found here.



vsdbcmd.exe, overriding variable value

When you deploy a database project with the command line utility vsdbcmd.exe you may want to change the value of some variables of the project. Suppose you’ve created a variable called Path1


And you have used this variable to specify the location of the files

ALTER DATABASE [$(DatabaseName)]
    ADD FILE (NAME = [Northwind], FILENAME = '$(Path1)$(DatabaseName).mdf',

Now you want to decide the physical location of your database at deploy time. The main problem is that vsdbcmd.exe takes deploy variables from the Database.sqlcmdvars file, so if you want to change the value of a variable you have two options. The first is using some XPath knowledge to change the value in Database.sqlcmdvars, since it is a simple XML file.

<?xml version="1.0" encoding="utf-8"?>
<SqlCommandVariables xmlns="urn:Microsoft.VisualStudio.Data.Schema.Project.SqlCmdVars">

With some XPath knowledge you can change the value of the Path1 property. Suppose you change it to v:\ (v: is my ramdisk); then you run vsdbcmd.exe and you will find that the files are indeed created in the correct location.
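Here is a small sketch of that XPath edit in C#. It works against an in-memory copy of the file for illustration; the Property/PropertyName/PropertyValue element names are an assumption about what the project designer writes into the sqlcmdvars file, so verify them against your own Database.sqlcmdvars before scripting this.

```csharp
// Sketch: locate the Path1 variable in a sqlcmdvars document with XPath and
// point it at the ramdisk. Element names below are assumed, not guaranteed.
using System;
using System.Xml;

string sqlcmdvars = @"<?xml version=""1.0"" encoding=""utf-8""?>
<SqlCommandVariables xmlns=""urn:Microsoft.VisualStudio.Data.Schema.Project.SqlCmdVars"">
  <Properties>
    <Property>
      <PropertyName>Path1</PropertyName>
      <PropertyValue>c:\</PropertyValue>
    </Property>
  </Properties>
</SqlCommandVariables>";

var doc = new XmlDocument();
doc.LoadXml(sqlcmdvars);

// The file has a default namespace, so a prefix must be registered for XPath.
var ns = new XmlNamespaceManager(doc.NameTable);
ns.AddNamespace("v", "urn:Microsoft.VisualStudio.Data.Schema.Project.SqlCmdVars");

XmlNode value = doc.SelectSingleNode(
    "//v:Property[v:PropertyName='Path1']/v:PropertyValue", ns);
value.InnerText = @"v:\";

Console.WriteLine(value.InnerText); // v:\
```

In a real deploy script you would load the file with `doc.Load(path)`, apply the same edit, then `doc.Save(path)` before invoking vsdbcmd.exe.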


If you do not like using XPath to change the configuration file, you can keep a series of preconfigured files and swap them in before calling vsdbcmd.exe. Sadly, the documentation states that you have an option to choose the variables file


But actually you get an error when you use it.



Data Dude for Visual Studio 2010

For those out there who are, like me, fans of Data Dude, here is a link that shows what is new for Data Dude in VS2010.

The most appealing part is that “database projects” are now supported starting from the Professional edition of VS, so I’m expecting that more and more people will try to manage databases with Visual Studio. Clearly most of the features are still available only to VS Premium or VS Ultimate users, but at least people with Professional can compile a DB project and execute unit tests or data generation plans (though only people with Premium and Ultimate are able to create and manage them).

This is especially important in large teams, where not all developers can work with the higher editions of VS. With 2008, a user with Professional cannot even open a DB project, and the only solution is to create different solutions: one for the people with Professional, and another for those with Data Dude. With VS2010 we can forget this problem.

Another good thing is that the TFS Build Agent now has the Data Dude bits installed, so you are no longer required to install the full Visual Studio Database Edition on all build machines. This is more great news for Continuous Integration addicted folks.



Data Driven Test of a stored procedure in Database Edition

Database tests are really a great feature of Database Edition and, like other tests, you can create a data driven test of a stored procedure with little effort. The first thing I do is create a database project that will contain a table for each set of data I want to use in a test.


In this project I create a table for each set of data, and a post-deployment script to preload the data. In this example I want to test a stored procedure that accepts a string filter, and for each filter I want to verify the number of rows returned. To achieve this I need a set of test data with two columns: a NameFilter column and an ExpectedResultCount column. I create this table in the TestDatabase database project.


Ok, now here is the TestDataSet1 deployment script

-- =============================================
-- Script Template
-- =============================================

    TRUNCATE TABLE [dbo].[TestDataSet1]
    INSERT [dbo].[TestDataSet1] VALUES('da', 1)
    INSERT [dbo].[TestDataSet1] VALUES('e', 18)
    INSERT [dbo].[TestDataSet1] VALUES('det', 0)


I simply need to remove all the old data and preload the real data. Now here is the stored procedure I want to test.

CREATE PROCEDURE [dbo].[GetCustomerByName]
    @filter nvarchar(30)
AS
    SELECT * FROM [dbo].[Customer]
    WHERE [Name] LIKE '%' + @filter + '%'

Absolutely simple, and I’ve already generated a DataGenerationPlan and verified that searching with ‘da’ returns only one row, with ‘e’ it returns 18 rows, etc. To save time I simply right click and “Deploy” the test database only when the test data changes; clearly this database is read-only, because tests use it only to data drive the test. Then I create a very simple stored procedure test, and modify it in this way.


We need to notice a couple of things. First of all, the @filter parameter was not declared in the T-SQL code, because we will take it from the test table; the other peculiarity is that NumberOfRowReturned is a single condition that expects 0 rows to be returned. Now it is time to data drive the test. The whole operation is covered in this article; here is a brief summary. First of all you need to modify the test project’s App.config, adding a section

<configSections>
  <section name="microsoft.visualstudio.testtools"
           type="Microsoft.VisualStudio.TestTools.UnitTesting.TestConfigurationSection, Microsoft.VisualStudio.QualityTools.UnitTestFramework, Version=, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a"/>
</configSections>

And then, inside that section, the data source definition

<microsoft.visualstudio.testtools>
  <dataSources>
    <add name="SqlClientDataSource"
         connectionString="SqlClientConn"
         dataTableName="TestDataSet1"
         dataAccessMethod="Sequential" />
  </dataSources>
</microsoft.visualstudio.testtools>

I’m saying that I want to create a new data source to data drive my stored procedure test. This data source is called SqlClientDataSource, it uses the SqlClientConn connection string and the TestDataSet1 table, and its access method is Sequential, which means the test will run once for each row in the table.

Now the fun part begins, because you need to instruct the test to use this data: right click the database test and choose “View Code”. For those not used to database testing, a database test is composed of a simple C# or VB wrapper that lets you use a designer, but in the background there is the usual test structure. The first step is to choose the data source.

[DataSource("SqlClientDataSource")]
[TestMethod]
public void dbo_GetCustomerByNameTest()

Now you need to pass the filter parameter to the stored.

String NameFilter = (String)TestContext.DataRow["NameFilter"];

DbParameter[] p = new DbParameter[1];

p[0] = this.ExecutionContext.Provider.CreateParameter();
p[0].Direction = ParameterDirection.Input;
p[0].ParameterName = "@Filter";
p[0].Value = NameFilter;

The code is really simple. Since it is a data driven test, the TestContext has its DataRow populated with the current row of the TestDataSet1 table, so I take the value of the NameFilter column and then create an array with one DbParameter, named @Filter, that will be passed to the test. Now I need to modify the test condition, because I need to set the expected row count; this takes another few lines of code.

int ExpectedResultCount = (int)TestContext.DataRow["ExpectedResultCount"];
DatabaseTestActions testActions = this.dbo_GetCustomerByNameTestData;
RowCountCondition rowCountCondition =
    (RowCountCondition) testActions
                            .Single(c => c.Name == "NumberOfRowReturned");
rowCountCondition.RowCount = ExpectedResultCount;

If this code seems a hack, you should consider how the designer codes the test under the hood. First of all I store in ExpectedResultCount the number of rows I expect the stored procedure to return; then, with a simple LINQ query, I take the condition named “NumberOfRowReturned”, the name I gave to the condition in the designer. Since I already know it is a row count condition, I can cast it to the right type, RowCountCondition, and finally change its RowCount property. The final step is passing the parameter array to the test.

System.Diagnostics.Trace.WriteLineIf((testActions.TestAction != null), "Executing test script...");
ExecutionResult[] testResults = TestService.Execute(this.ExecutionContext, this.PrivilegedContext, testActions.TestAction, p);

This is achieved with an overloaded version of the Execute method that accepts the parameter array as its last argument. Now you can run the test and look at the result.


As you can verify, the stored procedure was called three times, and for each run a different assertion was checked. To verify it, try changing the post-deploy script of the TestDatabase project, adding a row.

TRUNCATE TABLE [dbo].[TestDataSet1]
INSERT [dbo].[TestDataSet1] VALUES('da', 1)
INSERT [dbo].[TestDataSet1] VALUES('e', 18)
INSERT [dbo].[TestDataSet1] VALUES('det', 0)
INSERT [dbo].[TestDataSet1] VALUES('a', 22)

The last row says we expect 22 customers with an ‘a’ in the name, but I already know that my data generation plan generates only 17 records matching this condition. Run the tests again and this is the output.


Now you can verify that the test was run 4 times, and that the last run fails.

This technique is very powerful, because you can test a stored procedure or trigger several times with only one test definition, just exercising it with different input data taken from another database.



Faster Database Test With database in Virtual Disk

When you work with Visual Studio Database Edition, you surely love data generation plans and the ability to do database unit testing with automatic deploy and automatic data preload. There is only one bad thing about database tests: they are usually slow, because they access disks.

To speed up database testing the best solution is to test the database in memory, but with SQL Server this can be difficult… or not? With simple software that creates a RAMDisk (a disk made of RAM, like the Dataram one), you can create a RAMDisk with a few clicks and have a disk that actually runs at memory speed. I’ve created a 256 MB RAMDisk called V:\ and now I want to test against a database stored on that disk.

If you do not have Database Edition this can be tricky, but with Database Edition it is a joke. First of all, expand the Properties folder of the project and double click Database.sqlcmdvars.


Now create a new variable that points to the virtual disk.


Now go to Schema Objects/Storage/Files and, for each file of the target database, edit the script changing the value $(DefaultDataPath) to the new variable.

ALTER DATABASE [$(DatabaseName)]
    ADD FILE (NAME = [MyNameOfPrimaryFile], FILENAME = '$(VirtualDiskDataPath)$(DatabaseName).mdf', SIZE

And the game is done: now when you deploy, the database files get created on the virtual disk, and your database tests against a SQL Server database will run faster, because the database files are in memory. Thanks to Database Edition, this can be achieved with really few clicks.