Wednesday, March 12, 2008

Dimension Security Issues with Excel 2007 and SSAS 2005 Parent-Child Hierarchies

I've seen and read a lot of different posts and blog entries related to Excel 2007 and SSAS 2005 issues. Mostly these involve things that don't work as planned with regard to displaying and using calculated members, incorrect query results given Excel's subcube-based filtering and its interaction with default members, and so on. But here's one that I don't think I've run across yet that actually causes an error to occur while browsing a cube...

Suppose you have dimension security set up on a parent-child dimension. This is a common scenario in a sales cube or any other cube that contains a parent-child dimension related to people within an organization. In most cases, individuals should be restricted to just seeing themselves plus their subordinates within the parent-child hierarchy. Typically, I've seen this done by including AD logins as an attribute in the dimension and then using an MDX expression to restrict the allowed set of members within the parent-child hierarchy to just the user currently logged in (which then includes their subordinates by default in a parent-child hierarchy).

Here's a quick example using Adventure Works. I've created a new role in the Adventure Works SSAS solution named "Employees Security.role". This role specifies dimension data security based on the [Employee] dimension within the [Adventure Works] cube. Security is enabled on the [Employees] attribute within the dimension using the following MDX expression for the Allowed Member Set property:

Filter(
 [Employee].[Employees].Members,
 [Employee].[Employees].CurrentMember.Properties("Login ID") = "adventure-works\amy0"
)

Note that in this example, I'm referring directly to the AD login for the employee named Amy E. Alberts; normally, the MDX expression would use the UserName function to return the AD login of the current user. Doing it this way allows me to easily test the security role. Also note that I have Visual Totals enabled for the security role. Here's what the security role looks like:
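For reference, in a production role the Allowed Member Set would typically use UserName in place of the hard-coded login. A sketch (attribute and member property names here assume the Adventure Works [Employee] dimension):

```
Filter(
 [Employee].[Employees].Members,
 [Employee].[Employees].CurrentMember.Properties("Login ID") = UserName()
)
```

Because [Employees] is a parent-child hierarchy, allowing a member implicitly allows that member's descendants, which is what gives each user their subordinates automatically.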

Dimension Security Settings for Employees Security Role

With this in place, I can browse the [Adventure Works] cube. When I browse the cube without using the security role, I can see all of the employees in the [Employees] parent-child dimension as shown below:

All Employees Visible with No Security Role

However, when I connect to the cube through the security role (using the Change User button on the browser toolbar in either the BIDS cube designer or in the SSMS cube browser window), I'm relegated to seeing just Amy (along with her superiors -- required so that I can drill down to Amy) and her subordinates as shown below:

Only Secured Employees Visible with Security Role

Works great. Notice that Visual Totals has kicked in with the role so that I cannot see any numbers beyond those of Amy. This even works well within Excel 2003. If I connect and create a pivot table using the standard features in Excel 2003, I can see all of the employees when not connected through the security role:

Excel 2003 PivotTable with No Dimension Security

If I update the .oqy file created by Microsoft Query within Excel 2003, I can add "Roles=Employees Security" to the end of the connection string to get Excel 2003 to connect through the security role. After doing this, when I create the same pivot table as seen above, dimension security kicks in as I would expect it to:

Excel 2003 PivotTable with Dimension Security

Notice that Visual Totals is working correctly and I'm limited to seeing Amy and her subordinates (and her superiors again in order to provide a drill-down path to get to her). Fabulous!
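For reference, the edited connection string in the .oqy file ends up looking something like the following (the provider version, server name, and catalog name here are placeholders for my test setup; adjust for your own environment):

```
Provider=MSOLAP.3;Data Source=localhost;Initial Catalog=Adventure Works DW;Roles=Employees Security
```

The Roles property simply tells SSAS to evaluate the connection as if the user belonged only to the listed role(s), which is what makes it handy for testing security without logging in as a different user.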

However, when I try this same thing using Excel 2007, it just doesn't work. First, I'll create an Excel 2007 pivot table without any dimension security in place. This works fine using the parent-child hierarchy:

Excel 2007 PivotTable with No Dimension Security

Okay so far. Now, if I update the connection used by the pivot table (highlight the Data option on the Excel ribbon, click on the Properties toolbar button in the Connections group, then add "Roles=Employees Security" to the end of the Connection String on the Definition tab of the Connection Properties dialog box), I can attempt to recreate the pivot table. When I add the [Employees] parent-child hierarchy along with the [Reseller Sales Amount] measure (as done above), things look promising (and you can see that Visual Totals has kicked in):

Excel 2007 PivotTable with Dimension Security

However, as soon as I try to drill down to get to Amy, boom!!

Error in Excel 2007 PivotTable with Dimension Security

Ugh. And to date, I've not been able to find a workaround. If anyone out there has ideas on how to handle this, please let me know. I've tried defining the MDX expression in the security role in a couple of different ways, and I've tried tweaking the pivot table settings in Excel 2007 in various ways. All to no avail. Uh, Microsoft, can we get this fixed (along with all the other ugly issues that exist when using Excel 2007 as a client for SSAS)?

SSIS Package to Back Up SSAS Databases

There are a couple of different posts and blog entries out there that discuss backing up SSAS databases using various techniques such as AMO, SSIS, etc. I recently created an SSIS package to do the same thing, and I think it's a little bit simpler than some of the other solutions out there.

The basic design of the package is to use a ForEach Loop container to loop over the list of databases on a given SSAS server (rather than using AMO or some other method of getting a list of databases) and then to use an Analysis Services Execute DDL Task with an XMLA script that backs up the current database.
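For a single database, the XMLA that gets executed looks roughly like this (the database name here is just an example; the package builds this script dynamically for each database):

```xml
<Backup xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>Adventure Works DW</DatabaseID>
  </Object>
  <File>Adventure Works DW.abf</File>
  <AllowOverwrite>true</AllowOverwrite>
</Backup>
```

With no path specified in the File element, SSAS writes the .abf file to the server's default backup directory.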

The figure below shows the overall control flow in the package:

Control Flow for Backup SSAS Databases Package

The ForEach Loop container uses the ADO.NET Schema Rowset Enumerator along with an ADO.NET connection to the SSAS server. Thus, the package contains a Connection Manager using the .Net Provider for OLEDB along with the OLE DB Provider for Analysis Services 9.0. The Connection Manager looks like this:

Connection Manager for ADO.NET Connection to SSAS

Notice the Provider selected, the Server name entered, and the fact that no Initial Catalog is selected (since we're going to loop over all of the catalogs with the ForEach loop container). In this case, the Connection Manager just connects to the local SSAS server, but that could be dynamically handled with a package variable and a property expression on the Connection Manager.

The important settings for the ForEach Loop Container look like this:

Collection Settings for ForEach Loop Container

Variable Mappings Settings for ForEach Loop Container

Note that the Collections settings specify the ADO.NET Schema Rowset Enumerator, the defined Connection Manager, and the Catalogs schema. Also note that the Variable Mappings tab specifies that the catalog name returned by the enumerator be stored in a package variable named DatabaseName.

Within the ForEach Loop container, you'll see a Script Task as well as an Analysis Services Execute DDL Task. The Script Task is used to generate the XMLA script needed to back up the current database, and the Analysis Services Execute DDL Task then executes that XMLA script. The Script Task is configured as follows:

Script Settings for Script Task

Note that the DatabaseName variable is passed in as a read-only variable while another package variable named BackupScript is passed in as a read-write variable. The script within the task looks like this:

Public Sub Main()

    ' Build the XMLA backup script for the current database
    Dts.Variables.Item("BackupScript").Value = _
        "<Backup xmlns=""http://schemas.microsoft.com/analysisservices/2003/engine"">" + _
        " <Object>" + _
        "  <DatabaseID>" + Dts.Variables.Item("DatabaseName").Value.ToString + "</DatabaseID>" + _
        " </Object>" + _
        " <File>" + Dts.Variables.Item("DatabaseName").Value.ToString + ".abf</File>" + _
        " <AllowOverwrite>true</AllowOverwrite>" + _
        "</Backup>"

    Dts.TaskResult = Dts.Results.Success

End Sub

Sorry for the additional line wrapping in the code above, but the important point is that the BackupScript variable will contain a well-formed XMLA script for backing up the database named in the DatabaseName variable. The script is fairly simple in nature as it simply uses a backup filename of the database name with .abf appended and it assumes the backup file can be overwritten. Other scenarios could be handled (time-stamping the backup filenames, for example) via additional code.
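As a sketch of the time-stamping idea, the File element in the script could be built like this instead (the date format string is just one option; pick whatever granularity suits your backup schedule):

```vb
" <File>" + Dts.Variables.Item("DatabaseName").Value.ToString + _
    "_" + DateTime.Now.ToString("yyyyMMdd_HHmmss") + ".abf</File>" + _
```

This keeps one backup file per run rather than overwriting a single file, so you'd also want a cleanup step (or a maintenance job) to prune old .abf files.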

Once the Script Task executes, the BackupScript variable is ready to use to back up the current database (within the context of the ForEach Loop container). The Analysis Services Execute DDL Task is configured to execute the backup script as follows:

DDL Settings for AS Execute DDL Task

Note that the Analysis Services Execute DDL Task uses a native OLE DB provider to connect to SSAS, not an ADO.NET provider. Thus, a second Connection Manager is needed within the package to handle this connection. It is configured as follows:

Connection Manager for OLE DB Connection to SSAS

Note that it is configured the same as the ADO.NET-based Connection Manager above, but uses a Native OLE DB provider. Also note that the server name here could be made dynamic via a variable and a Property Expression as well.

And that's it. If you run the package, it will loop through all of the databases on the SSAS server and back up each one. If this needs to be done on a scheduled basis, the package can easily be deployed to one or more SSAS servers and scheduled using SQL Server Agent.

I've pushed a copy of the package up to my SkyDrive in case you want to take a look or even use it (as is of course!) yourself. Here's a link to the zip file there:

Let me know what you think...

Wednesday, March 5, 2008

Current Date Parts for SSRS Reports using SSAS Cubes

Recently, while working with a client on the development and deployment of a set of SSRS reports using SSAS as a data source, we came up with a nifty method of defaulting date-part report parameters to the current date.

When creating SSRS reports based on SSAS, a common scenario is to want date-part parameters that map to various attribute hierarchies within a date dimension. For example, in most SSAS solutions, a date dimension has attribute and user hierarchies that contain parts of a date. If these are added to a report as parameters, you might want the various date parts to default to parts of the current date. So, a report parameter based on a year attribute might need to default to the current year; a report parameter based on a month attribute might need to default to the current month.

Most solutions that I've seen to this problem go about it via MDX (by adding new "current date" calculated members to the date dimension) or via SSRS expressions (using various functions to create the right default values). The solution we came up with, however, relies on the fact that the date dimension in SSAS is typically processed from an underlying date table in the relational database used as the data source for SSAS. Let's look at an example of how this might work (using Adventure Works).

First, let's create a new report for Adventure Works using SSAS as the data source. In an SSRS project, you can use the Report Wizard to create a new report. Let's assume you need to create a report showing reseller sales, allowing users to filter the report by year and month. Using the Query Builder made available in the Report Wizard, we can create the following query for the report:

SSAS Query

Notice the parameters that are included (for the [Date].[Calendar Year] and [Date].[Month of Year] attributes) along with the default values ([CY 2001] and [July] respectively) that I selected. After finishing the Report Wizard and then applying a bit of formatting to the report, here's what it might look like:

Initial Reseller Sales Report

Notice the report parameters at the top, along with the default values that got filled in (based on the selections I made when the parameters were created in the Query Builder). If you inspect the report parameters, you can see what the format of each of the default values looks like:

Year Parameter Default Value Format

Month Parameter Default Value Format[4]

Notice that the default for the year parameter is the string "[Date].[Calendar Year].&[2001]" and the default for the month parameter is the string "[Date].[Month of Year].&[7]". If you look at the designs of these attributes in the SSAS solution, these map to values in the key columns DimTime.CalendarYear and DimTime.MonthNumberOfYear.

Now, suppose users of this report want the year parameter to default to the current year and the month parameter to default to the current month. Within the AdventureWorksDW relational database, a query along the following lines returns the key columns listed above for each date in the DimTime table:

select
 CalendarYear
 ,MonthNumberOfYear
 ,FullDateAlternateKey
from dbo.DimTime
If we apply some formatting to the columns returned, we can get values back that look like the default parameter strings noted above:

select
 '[Date].[Calendar Year].&['
 + convert(varchar, CalendarYear)
 + ']' as CalendarYearValue
 ,'[Date].[Month of Year].&['
 + convert(varchar, MonthNumberOfYear)
 + ']' as MonthofYearValue
from dbo.DimTime

And if we apply a where clause to the query, then we can return the formatted values for a single date -- in particular, the current date (likely using the getdate() function). Now, in the AdventureWorksDW database, the DimTime table doesn't have dates up through the current date; thus, we'll simply ask for the maximum date in the table. And we'll wrap this query into a stored procedure named dbo.GetDefaultDateParts:

create procedure dbo.GetDefaultDateParts
as
select
 '[Date].[Calendar Year].&['
 + convert(varchar, CalendarYear)
 + ']' as CalendarYearValue
 ,'[Date].[Month of Year].&['
 + convert(varchar, MonthNumberOfYear)
 + ']' as MonthofYearValue
from dbo.DimTime
where
 -- FullDateAlternateKey = convert(varchar, getdate(), 101)
 FullDateAlternateKey = (select max(FullDateAlternateKey) from dbo.DimTime)

With this stored procedure in place, we can now update the report to use these new default values. First, we need to add a new data set to the report, based on the AdventureWorksDW relational database, calling this new stored procedure. Here's what the properties for the new data set might look like:

Data Set for GetDefaultDateParts Stored Procedure

And here's what a query of the data set would return:

GetDefaultDateParts Data Set Results

So, with this new data set available, we can simply update the report parameters such that each parameter's default value comes from the data set (instead of being based on the values that were originally selected when the report parameters were created). Here are the updated report parameters:

Updated Calendar Year Report Parameter

Updated Month of Year Report Parameter

Note that each parameter has been set to be single-valued (although this doesn't necessarily have to be the case) and that each parameter now gets its default value from the appropriate field of the new data set.

Finally, previewing the report will show that the parameters get filled in with our new default values (2004 for the year and August for the month):

Report with New Default Values for Parameters

Now, in this particular case, it looks like Adventure Works didn't have any reseller sales for August 2004. So, the default values we created via the new stored procedure might not be optimal. But, now that we have a stored procedure for returning the default values, it could be updated to return the date parts for the last date with data, the last date of the last "closed" financial month, or just about anything else that makes sense for our users and our business scenario. The stored procedure could even be updated to accept parameter values to control it in various ways, with different reports calling for different dates for example.
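As one sketch of the "last date with data" variation, the lookup could be limited to dates that actually have reseller sales (the join below assumes the AdventureWorksDW columns FactResellerSales.OrderDateKey and DimTime.TimeKey; the procedure name is just illustrative):

```sql
create procedure dbo.GetDefaultDatePartsWithData
as
select
 '[Date].[Calendar Year].&[' + convert(varchar, CalendarYear) + ']' as CalendarYearValue
 ,'[Date].[Month of Year].&[' + convert(varchar, MonthNumberOfYear) + ']' as MonthofYearValue
from dbo.DimTime
where FullDateAlternateKey =
 (select max(t.FullDateAlternateKey)
  from dbo.DimTime t
  join dbo.FactResellerSales f on f.OrderDateKey = t.TimeKey)
```

The same pattern works for "last closed financial month" and similar rules; only the subquery that picks the date needs to change.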

There are certainly some cases in which this may not work. For example, if you are using a server-based date dimension in SSAS (with no underlying date table in the relational database) or if you have a very slim date dimension table and are creating additional columns within your DSV. So, your mileage may vary. But, for quite a number of scenarios and situations, this seems to work out well and avoids some of the complexity I've seen people create just to do this type of thing.

I've uploaded a copy of the Reseller Sales report sample displayed above along with a T-SQL script for creating the stored procedure. Both are available here:

Give it a try and let me know if this seems like a useful technique...