Change Package Properties Using Parameters (SSIS 2012)

In DTS, changing package properties was close to impossible (the less we talk about DTS, the better). Package configurations, introduced in 2005, solved most of the problems. In 2012, SQL Server Integration Services introduces parameters to make this process even easier: parameters let you change package properties without the need for configuration files.

In this example, we will create a variable and change its value at run time using parameters.

Let’s enter the Demoland!

In an SSIS package, I created a variable called PromptValue and assigned it the value “I’m Variable”.

[screenshot: the PromptValue variable]

Next, I added a Script Task to the package, entered PromptValue as a read-only variable, and used the following script.

MsgBox(Dts.Variables("PromptValue").Value.ToString())

I see the following message box when I execute the package.

[screenshot: message box showing “I’m Variable”]

Now, we want to be able to change this value at execution time. This is where we would normally create a package configuration and add the variable to it, so we could simply change the configuration file without editing the package.

But this time we will use parameters. I added a parameter called ParmPromptValue and assigned it the value “I’m Parameter”. Parameters come in two scopes: package and project. Package parameters can be used only within their package; project parameters can be used by all the packages in a project.

I created a package parameter.

[screenshot: the ParmPromptValue package parameter]

Next, we have to pass the value of this parameter to the variable we created earlier. We will use an expression to do that. We already know we can reference other variables in the expression builder; in 2012, we can reference parameters as well. I dragged the package parameter @[$Package::ParmPromptValue] into the expression window.


What should we see when we execute the package? Should it be “I’m Variable”, or should it be “I’m Parameter”?

Since the variable now gets its value from an expression, the parameter’s value overwrites the value we originally assigned to the variable.


So far we have executed the package from Visual Studio. Next, I deployed the package to the SSIS catalog and assigned a different value to the parameter. More about deployment in the next post.


This is what I see when I run the package from the SSIS catalog (server).

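For completeness, the same override can be scripted instead of set through the UI: SSISDB exposes catalog stored procedures for creating an execution, overriding a parameter value, and starting the run. Below is a minimal sketch that assembles such a batch in Python so it is easy to reuse; the folder, project, and package names are placeholders, not values from this demo.

```python
def build_execution_batch(folder, project, package, parameter, value):
    # Builds the T-SQL you would run against the SSISDB catalog.
    # N'...' literals: single quotes inside `value` must be doubled by the caller.
    return f"""DECLARE @execution_id BIGINT;

EXEC SSISDB.catalog.create_execution
    @folder_name = N'{folder}',
    @project_name = N'{project}',
    @package_name = N'{package}',
    @execution_id = @execution_id OUTPUT;

-- @object_type = 30 targets a package parameter (20 = project parameter)
EXEC SSISDB.catalog.set_execution_parameter_value
    @execution_id,
    @object_type = 30,
    @parameter_name = N'{parameter}',
    @parameter_value = N'{value}';

EXEC SSISDB.catalog.start_execution @execution_id;"""

print(build_execution_batch("Demo", "DemoProject", "Package.dtsx",
                            "ParmPromptValue", "I''m Server Parameter"))
```

Running the printed batch in SSMS starts the package with the overridden value, which is exactly what the catalog UI does behind the scenes.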

Why in the world would you want to change a value like in this example, you might ask. Fair enough; this is a contrived example. But think of connection strings, database names, and user names and passwords that you can’t avoid changing when moving packages between environments.

I’ve provided an introduction to parameters in this post. For more information, be sure to read Parameters in Denali by Jamie Thomson and the related TechNet Wiki article.

Resources

Download the sample package from here.

~Sam.

Follow Sam on Twitter – @SamuelVanga


Do You Know These Free Events?

I frequently attend a few online training events that greatly help me learn new things, so I thought I would share them in case you are interested too. These are all cool people sharing information with the community for free.

Go get'em!

All the sessions are recorded and posted on their respective blogs. If you miss the live event, you can always catch up later.

Do you know something else that isn’t here? Please share. Drop a comment below. And SPREAD THE WORD, would ya!

~Sam.

Why Isn’t There A Deployment Server Edition For SSIS Projects?

Andy Leonard blogged yesterday about a gotcha when creating SSIS projects using SSDT. He showed that SSDT lets you use any component, whether or not your target SQL Server edition supports it. For example, you can create a package with Fuzzy Lookup and deploy it to a server running Business Intelligence edition (which doesn’t support Fuzzy Lookup); you won’t even be warned until the package bombs when executed from the command line or SQL Agent.

Rafi Asraf commented on that post about how SSAS projects handle this better. I’ll try to elaborate on that comment here.

There is a property called Deployment Server Edition for SSAS projects.

SSAS Multidimensional.

[screenshot: Deployment Server Edition property, SSAS multidimensional]

SSAS Tabular.

[screenshot: Deployment Server Edition property, SSAS tabular]

I selected Standard edition, which doesn’t support perspectives. When I try to create one, SSDT shows a warning message. That’s friendly behavior.


Now, why isn’t a similar property available for SSIS projects? That, my friend, is beyond the scope of my brain. Adding it would definitely save a lot of time and reduce confusion for developers.

~Sam.

Follow Sam on Twitter – @SamuelVanga

Execute T-SQL Statement Task

I spotted a new task in SSIS 2012: the Execute T-SQL Statement Task. It executes T-SQL statements, and only T-SQL statements, against SQL Server databases. Cool, but don’t get too excited. Why? Because you can neither parameterize the SQL nor save the results to a variable.


The task editor looks like below:


Only plain, static scripts can be executed, and the Execute SQL Task already covers those. So I don’t see a specific need for this task, and I don’t plan on using it in the near future. Do you agree?

~Sam.

Update: Please note, this task isn’t new in 2012. It existed in previous versions. I apologize for the incorrect information. Thanks to Devin Knight (b|t) for correcting me.

Simple Tips To Speed Up Your SSIS Data Flow

There are a few design choices that can make your SSIS packages run super fast when loading data from table A to table B.

SQL Command at Source

If the data access mode is Table or View, the OLE DB source editor lets you choose a table or a view from a drop-down. This issues a SELECT * against the data source. As a best practice, it is better to list the columns explicitly even when you are selecting all of them: choose SQL Command as the data access mode and type in the query.

[screenshot: SQL Command data access mode]

Table or View – Fast Load at Destination

As the name suggests, Table or View – Fast Load is the fastest way to load data into the destination: it applies a table lock on the destination table and performs a bulk insert.

On a machine with 8 GB of RAM and 4 processors, my package loaded 7.9 million records in 58 seconds. Compare this to 50 minutes when using Table or View as the data access mode. That option works much like a cursor, inserting one record at a time: clear evidence of its poor performance.


50 minutes? hell No! But 58 seconds to load about 8 million records isn’t bad. Right?
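To put those two runs side by side, here is a quick back-of-the-envelope calculation using the numbers above:

```python
# Throughput for the two runs: 7.9M rows in 58 seconds (fast load)
# versus roughly 50 minutes (row-by-row insert).
rows = 7_900_000
fast_load_secs = 58
row_by_row_secs = 50 * 60

fast_rate = rows / fast_load_secs    # rows per second with bulk insert
slow_rate = rows / row_by_row_secs   # rows per second inserting one at a time

print(f"fast load : {fast_rate:,.0f} rows/sec")
print(f"row by row: {slow_rate:,.0f} rows/sec")
print(f"speedup   : {row_by_row_secs / fast_load_secs:.0f}x")
```

Roughly 136,000 rows per second versus 2,600: a ~52x difference from a single drop-down setting.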

Let’s dig a little deeper to see if we can tune this even more.

Default Buffer

SSIS relies heavily on buffers. A set of rows is read into a buffer, and from the buffer the rows are written to the destination; the process continues until all rows have been written. In this example, the buffer carries 9,216 rows at a time, which you can see by using a data viewer.


This number is controlled by two properties, DefaultBufferMaxRows and DefaultBufferSize. The defaults are 10,000 rows and 10 MB, and whichever limit is reached first wins. The internal MaxBufferSize is 100 MB: the largest buffer SSIS will allocate.

EstimatedRowSize is another property, calculated from the metadata of the result set. For my data set it is 32 bytes: BIGINT (8) + INT (4) + DATETIME (8) + INT (4) + MONEY (8) = 32 bytes.


The buffer reached the default row maximum at 9,216 rows, which amounts to 9,216 rows × 32 bytes per row = 294,912 bytes. That is less than 1 MB, while the maximum is 100 MB, so there is a lot of free space left in the buffer. Filling it means fewer, fuller trips to the destination and better performance.
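The sizing rule can be sketched as the smaller of two limits: the row cap and however many rows of the estimated size fit in the buffer. This is a simplification of what the engine actually does (SSIS also rounds its allocations, which is why the observed count of 9,216 lands just under the 10,000 cap), but it shows why a small row size leaves the size limit far from binding:

```python
def rows_per_buffer(row_size_bytes, max_rows=10_000,
                    buffer_bytes=10 * 1024 * 1024):
    # Rows per buffer = min(row cap, rows that fit in the buffer size).
    return min(max_rows, buffer_bytes // row_size_bytes)

# Row size from the column list above: BIGINT + INT + DATETIME + INT + MONEY
row_size = 8 + 4 + 8 + 4 + 8                       # 32 bytes

default_rows = rows_per_buffer(row_size)           # defaults: 10,000 rows / 10 MB
tuned_rows = rows_per_buffer(row_size, max_rows=60_000,
                             buffer_bytes=90 * 1024 * 1024)

print(default_rows, tuned_rows)   # the row cap wins in both cases
```

With 32-byte rows, a 10 MB buffer could hold over 300,000 rows, so the row cap is the limit that actually binds; raising it is what fills the buffer.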

For this test, I left DefaultBufferSize at 10 MB but increased DefaultBufferMaxRows to 30,000 rows. The package now runs in less than 30 seconds.


Changing the settings to DefaultBufferSize = 90 MB and DefaultBufferMaxRows = 60,000 rows brought execution down to 15 seconds.

Wrap Up…

Understanding the internals can sometimes yield great performance gains without the need for additional hardware. This package runs on my local machine and moves data within the same database; of course, results may vary depending on your environment.

~Sam.