Deploying tabular projects using a custom msbuild task
I have received a few questions from customers about how to build and deploy Analysis Services projects in an unattended way, for integration into an automated build process. The stock reply is a multi-step solution, where you build from the command line and then deploy either by writing your own program (using AMO) or by calling the Deployment Wizard from the command prompt. It's not exactly an elegant solution, but it works.
Here is a slicker trick for tabular projects in Denali. As I pointed out in my previous post on the tabular project structure, tabular projects use msbuild to build. One of the great things about msbuild is its extensibility. So why bother with a multi-step build and deploy process when you could use msbuild extensibility to turn build into a single-step process that both builds and deploys your models? All you have to do is create a custom post-build task that performs the deployment. Since we now have AMO for PowerShell (pretend this documentation exists, it will get there), it is just a matter of executing a PowerShell script that calls Invoke-ASCmd after the build output has dropped.
What I thought was a simple project took a bit of time to implement. However, now that I have the solution, you can go and implement this at home in very little time at all.
Here is how you do it:
Ensure that the machine you will use for building has both BIDS and SSMS installed. This brings in the build support for .smproj files and the PowerShell cmdlets for AMO.
Install the msbuild extension pack. This extension pack has a custom msbuild task for executing a PowerShell script, which we will use in this example.
Place the custom_post_build.targets file in a sensible directory, accessible to msbuild. In my example below, I will use C:\Temp, but be more sensible in real life.
Create a tabular project, say TabularProject1.smproj. Close the solution.
After closing the solution, do File->Open->File…, pick TabularProject1.smproj, OK. This opens the project file in the XML editor.
Insert the Import element for custom_post_build.targets (the second Import below) just before the closing </Project> tag:
<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <!--redacted for brevity-->
  </PropertyGroup>
  <PropertyGroup Condition=" '$(Configuration)' == 'Development' ">
    <OutputPath>bin\</OutputPath>
  </PropertyGroup>
  <ItemGroup />
  <ItemGroup>
    <Compile Include="Model.bim">
      <SubType>Code</SubType>
    </Compile>
  </ItemGroup>
  <Import Project="$(MSBuildExtensionsPath)\Business Intelligence Semantic Model\1.0\Microsoft.AnalysisServices.VSHostBuilder.targets" />
  <Import Project="C:\temp\custom_post_build.targets" />
</Project>
Save and close the .smproj file.
From inside VS or from the command line, build the .smproj. Your project will build and deploy in one shot.
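If you are driving this from a build server, the command-line invocation might look something like the sketch below. The MSBuild.exe path and the Development configuration name are assumptions (they match the ToolsVersion and Configuration shown in the project file above); adjust them for your environment.
# Build (and, with the custom target in place, deploy) the tabular project unattended.
# The MSBuild 4.0 path and configuration name are illustrative assumptions.
& "$env:windir\Microsoft.NET\Framework\v4.0.30319\MSBuild.exe" `
    ".\TabularProject1.smproj" `
    /t:Build `
    /p:Configuration=Development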
Let's take a look at that custom_post_build.targets file. Don't laugh at the PowerShell script and the billions of lines for text concatenation. I learned PowerShell specifically for this little project, so efficiency, neatness, and elegance were sacrificed.
Anyway, what we have is a custom parameterized PowerShell task. The invocation of the task looks like this:
<Target Name="DeployDB">
<DeployDB OutPath="$(OutDir)" DBName="$(DeploymentServerDatabase)" ServerName="$(DeploymentServerName)"></DeployDB>
</Target>
What we are doing here is passing build properties specified in the .smproj file (which I redacted above) as parameters to the DeployDB build task, which is the task we are using to execute the PowerShell script.
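For reference, the deployment properties referenced by $(DeploymentServerDatabase) and $(DeploymentServerName) live in the PropertyGroup that was redacted in the .smproj listing above. A sketch of what that might look like, with placeholder values rather than real ones:
<PropertyGroup>
  <!-- Placeholder values; use your own deployment server and database names -->
  <DeploymentServerName>localhost\TABULAR</DeploymentServerName>
  <DeploymentServerDatabase>TabularProject1</DeploymentServerDatabase>
</PropertyGroup>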
The declaration of the DeployDB build task looks like this:
<UsingTask TaskFactory="PowershellTaskFactory" TaskName="DeployDB" AssemblyFile="MSBuild.ExtensionPack.TaskFactory.PowerShell.dll">
That line goes and loads that custom PowerShell build task from the msbuild extension pack, and names our PowerShell task DeployDB. Now I’ll skip some code and go to the PowerShell script inside.
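For orientation before we get to the script itself: a task built on the extension pack's PowershellTaskFactory declares its parameters in a ParameterGroup and carries the script in the Task element, following the standard MSBuild 4.0 inline-task pattern. The skeleton below is a sketch of that shape, not the actual targets file; the Required and ParameterType attributes are assumptions.
<UsingTask TaskFactory="PowershellTaskFactory" TaskName="DeployDB"
           AssemblyFile="MSBuild.ExtensionPack.TaskFactory.PowerShell.dll">
  <ParameterGroup>
    <!-- Names match the DeployDB invocation shown earlier -->
    <OutPath Required="true" ParameterType="System.String" />
    <DBName Required="true" ParameterType="System.String" />
    <ServerName Required="true" ParameterType="System.String" />
  </ParameterGroup>
  <Task>
    <![CDATA[
      # The PowerShell deployment script discussed below goes here.
    ]]>
  </Task>
</UsingTask>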
First major step, load the cmdlets. Whenever you want to use the AS cmdlets outside of the sqlps snap-in, you have to explicitly import the module to call the cmdlets:
Import-Module 'C:\Program Files (x86)\Microsoft SQL Server\110\Tools\PowerShell\Modules\SQLASCMDLETS'
Next step, work around a CTP3 bug in Denali. The deployment database name wasn't copied correctly into the .asdatabase file, so rewrite it using the real deployment DB name specified in the deployment properties. I will skip that initial code to construct the $oldDBID and $newDBID variables; it's not necessary for RTM.
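If you are on CTP3 and do need the workaround, here is a minimal sketch of one way those two variables could be built. This is not the code skipped above; it assumes the .asdatabase root element is the serialized Database object with its ID as a child element.
# Illustrative only: derive the old/new <ID> strings used by the -replace further down.
# Assumes a single .asdatabase file in the output folder.
[xml]$asdb = Get-Content ($OutPath + "*.asdatabase")
$oldDBID = "<ID>" + $asdb.Database.ID + "</ID>"
$newDBID = "<ID>" + $DBName + "</ID>"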
Next, we wrap the build output (an *.asdatabase file) in an Alter script, so we can execute it later.
$file = $OutPath + (Get-ChildItem -Path ($OutPath + "*.asdatabase") -Name)
$xmlaScript  = "<Alter AllowCreate='true' ObjectExpansion='ExpandFull' xmlns='http://schemas.microsoft.com/analysisservices/2003/engine'>"
$xmlaScript += "<Object><DatabaseID>"
$xmlaScript += $DBName
$xmlaScript += "</DatabaseID></Object><ObjectDefinition>"
$xmlaScript += ((Get-Content $file | Out-String) | ForEach-Object {$_ -replace $oldDBID, $newDBID})
$xmlaScript += "</ObjectDefinition></Alter>"
Pay very special attention to the line that reads the contents of the *.asdatabase file. There are two things here. First, it is very important to pipe the contents of the *.asdatabase file to the Out-String cmdlet. Otherwise, PowerShell will introduce unwanted line breaks into your .asdatabase file, which will break your model (it can cause measures not to be added to your database!). Second, the replace $oldDBID, $newDBID business should not be required at RTM.
I will skip my build log command; it's optional and I just use it for debugging. Finally, the command where we invoke the deploy script:
Invoke-ASCmd -Server:$ServerName -Query:$xmlaScript
There you have it. This build task is totally parameterized, so the same custom_post_build.targets can be reused across your organization. Also, if you don't like my PowerShell, feel free to write your own. You could also get very fancy here. You could automatically process your model too. You could make some test suites to sanity-check the numbers coming out of your model against the data sources.
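For example, a processing step could be tacked on right after the deployment. The sketch below reuses the $ServerName and $DBName task parameters and assumes a full process is what you want; it is not part of the targets file described above.
# Hypothetical follow-up: fully process the freshly deployed database.
$processScript  = "<Process xmlns='http://schemas.microsoft.com/analysisservices/2003/engine'>"
$processScript += "<Object><DatabaseID>" + $DBName + "</DatabaseID></Object>"
$processScript += "<Type>ProcessFull</Type>"
$processScript += "</Process>"
Invoke-ASCmd -Server:$ServerName -Query:$processScript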
Now you might not want everybody in your organization to hand-edit the .smproj to hook up with your custom build targets folder. Fear not. You can create a custom project template that does this for you; MSDN explains how. Making the template is a couple of clicks in the UI; it's pretty cool.
Thank you Leonid Lyakhovitskiy for helping me write the build targets file.
Comments
- Anonymous
August 30, 2011
Nice article, however I would suggest a change. In the Alter database XMLA generation, the .asdatabase is a serialization of the AMO database object, so you could do something like:
$database = New-Object Microsoft.AnalysisServices.Database
$xmlReader = New-Object System.Xml.XmlTextReader($sourceAsDatabasePath)
$xmlReader.ProhibitDtd = $true
$sourceDatabase = [Microsoft.AnalysisServices.Utils]::Deserialize($xmlReader, $database)
And then use the Microsoft.AnalysisServices.Scripter to generate the Alter command.
- Anonymous
September 14, 2011
Thanks, that is a good suggestion!
- Anonymous
August 13, 2015
I'd like to use multiple configurations for .smproj, and found an article referring to your solution: www.blue-granite.com/.../applying-configurations-tabular-model-ms-build However, the mentioned "if...else" block is not within your TARGETS file. Have you posted it elsewhere or is this idea not one of yours at all?