Thursday, December 17, 2009

Designing an ODS / DW with high availability and consistency

It's widely recognized that database sizes are growing significantly, driven by many factors, such as companies needing more data to remain available online for longer (e.g. to comply with government regulations) and an increasing amount of data being digitized for storage. This data explosion has also given momentum to business intelligence applications, which gather and store data for analyzing historical, current, and predictive views of business operations. The data on which these business analytics run is stored in a data warehouse, a data mart or an ODS (Operational Data Store). Because business success nowadays is heavily influenced by business intelligence applications, which in turn rely on the data warehouse or ODS for their data feed, it becomes essential to design a highly available data warehouse or ODS that provides consistent data at all times.

In this video/paper I am going to discuss several of the many available approaches you can take to design an ODS for high availability and data consistency. I will start with a very basic approach and list its pros and cons, then gradually move on to approaches that improve on the previous one in terms of availability and consistency. Finally, I provide some strategic ODS design choices and best practices to consider while designing and maintaining it.

Though I will be referring to an ODS design throughout, the same approaches can be applied to a data warehouse as well. You can watch the video or download the deck and article at your convenience; click here for more details.

SQL Server 2008 R2 - SQL Azure Enhancements

If you were unhappy with the capabilities of SQL Server Management Studio (SSMS) while working with SQL Azure, there is good news for you. Microsoft has announced the November CTP of Microsoft SQL Server 2008 R2. The SSMS in this version allows you to work with SQL Azure in almost the same way as when you are connected to a local SQL Server instance. In other words, you can now use your favorite Object Explorer in SSMS to browse through database objects hosted in SQL Azure as well. In this article, I am going to show how you can use SSMS's Object Explorer to connect to and browse a SQL Azure database. For more details click here.

SQL Azure - Starting up...

SQL Azure - Learning from scratch....

There has been a lot of buzz about cloud computing lately, and looking at the benefits it provides (in terms of cost savings, high availability, scalability to scale up or down, etc.) it is now evident that cloud computing is the future for next-generation applications. Many of tomorrow's applications will be designed and hosted in the cloud. Microsoft recognizes this potential and provides a cloud computing solution with Windows Azure. The Windows Azure platform, which is hosted inside Microsoft data centers, offers several services which you can leverage while developing applications that target the cloud. One of them is Microsoft SQL Azure, a cloud-based relational database service built on Microsoft SQL Server technologies. In this article, I am going to show how you can start creating databases and database objects in the cloud with SQL Azure. For more details, click here.

Tuesday, November 24, 2009

SQL Server Integration Services ( SSIS ) - Best Practices

Part 1 briefly talks about SSIS and its capabilities as an enterprise ETL tool. It then covers the considerations you need to take into account while transferring a high volume of data, including the effects of different OLE DB Destination settings such as Rows Per Batch and Maximum Insert Commit Size. For more details click here.
Part 2 covers best practices around using the SQL Server Destination Adapter, the different kinds of transformations and the impact of asynchronous transformations, the DefaultBufferMaxSize and DefaultBufferMaxRows, BufferTempStoragePath and BLOBTempStoragePath, and DelayValidation properties. For more details click here.
Part 3 covers best practices around achieving high performance with a higher degree of parallelism, how you can identify the cause of poorly performing packages, how distributed transactions work within SSIS, and finally what you can do to restart a package execution from the last point of failure. For more details click here.
Part 4 talks about best practices for SSIS package design: how you can use the Lookup transformation and what considerations to take while using it, the impact of implicit type casts in SSIS, changes in SSIS 2008 internal system tables and stored procedures, and finally some general guidelines. For more details click here.

Wednesday, October 28, 2009

Basic Storage Modes (MOLAP, ROLAP and HOLAP) in Analysis Services

There are three standard storage modes (MOLAP, ROLAP and HOLAP) in OLAP applications, which affect the performance of OLAP queries and cube processing, determine storage requirements, and also determine storage locations. To learn more about these standard storage modes and the pros and cons of each one, click here.

Database Impersonation with EXEC AS in SQL Server

SQL Server 2005/2008 provides the ability to change the execution/security context with the EXEC or EXECUTE AS clause. You can explicitly change the execution context by specifying a login or user name in an EXECUTE AS statement for batch execution, or by specifying the EXECUTE AS clause in a module (stored procedure, trigger or user-defined function) definition. Once the execution context is switched to another login or user, SQL Server verifies permissions against the specified login or user for subsequent statements, instead of against the current user. To learn more about this feature and how it works click here.
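As a quick sketch of how this works in a batch (the user name TestUser is a hypothetical example that must already exist as a database user):

```sql
SELECT USER_NAME();        -- current context, e.g. dbo

EXECUTE AS USER = 'TestUser';
SELECT USER_NAME();        -- now returns TestUser; permissions are
                           -- checked against TestUser from here on

REVERT;                    -- switch back to the original context
SELECT USER_NAME();        -- back to the original user
```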

Spatial Data Types (GEOMETRY and GEOGRAPHY) in SQL Server 2008

SQL Server 2008 provides support for geographical data through the inclusion of new spatial data types, which you can use to store and manipulate location-based information. These come in the form of two new native data types, GEOGRAPHY and GEOMETRY, which support the two primary spatial models: the geodetic model and the planar model. Geodetic data is sometimes called round earth because it assumes a roughly spherical model of the world using an industry-standard ellipsoid such as WGS84, the reference used by Global Positioning System (GPS) applications, whereas the planar model assumes a flat projection and is therefore sometimes called flat earth; its data is stored as points, lines, and polygons on a flat surface. To learn more about this new feature click here.
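A minimal sketch of both types (the coordinates below are illustrative):

```sql
-- GEOGRAPHY uses latitude/longitude on a round-earth model (SRID 4326 = WGS84)
DECLARE @seattle geography = geography::STGeomFromText('POINT(-122.335 47.608)', 4326);
DECLARE @redmond geography = geography::STGeomFromText('POINT(-122.121 47.674)', 4326);
SELECT @seattle.STDistance(@redmond) AS DistanceInMeters;

-- GEOMETRY uses a flat (planar) coordinate system
DECLARE @shape geometry = geometry::STGeomFromText('POLYGON((0 0, 10 0, 10 10, 0 10, 0 0))', 0);
SELECT @shape.STArea() AS Area;   -- area of the 10 x 10 square
```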

FILESTREAM Data Type in SQL Server 2008

The new SQL Server 2008 FILESTREAM data type enables SQL Server applications to store unstructured data, such as documents and images, on the file system with a pointer to the data in the database. This enables client applications to leverage the rich NTFS streaming APIs and the performance of the file system while maintaining transactional consistency between the unstructured data and the corresponding structured data, with the same level of security. Backups can include or exclude the binary data, and the data can be worked with using the standard SELECT, INSERT, UPDATE, and DELETE statements in T-SQL. FILESTREAM storage is implemented as a varbinary(max) column in which the data is stored as BLOBs on the file system. The sizes of the BLOBs are limited only by the volume size of the file system; the standard varbinary(max) limitation of 2 GB does not apply to BLOBs stored on the file system. To learn more about this new feature click here.
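A minimal sketch of a FILESTREAM-enabled table, assuming the instance has FILESTREAM enabled and the database has a FILESTREAM filegroup configured (table and column names are hypothetical):

```sql
CREATE TABLE dbo.Documents
(
    DocumentID   UNIQUEIDENTIFIER ROWGUIDCOL NOT NULL UNIQUE
                 DEFAULT NEWID(),               -- FILESTREAM requires a ROWGUIDCOL
    DocumentName NVARCHAR(260) NOT NULL,
    Content      VARBINARY(MAX) FILESTREAM NULL -- BLOB stored on the file system
);

INSERT INTO dbo.Documents (DocumentName, Content)
VALUES (N'readme.txt', CAST(N'Hello FILESTREAM' AS VARBINARY(MAX)));
```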

Large User Defined Types in SQL Server 2008

With SQL Server 2005, Microsoft integrated the .NET Common Language Runtime (CLR) into the database engine itself. With that, you were allowed to create a user-defined type (UDT) and use it in SQL Server in a similar way to any built-in data type, once the assembly containing the UDT was registered in the database. It was a good starting point, but the problem was that the size of a UDT was limited to 8,000 bytes. SQL Server 2008 overcomes this limitation by introducing large user-defined types, increasing the maximum size all the way up to 2 GB. Learn more here.

HIERARCHYID Data Type in SQL Server 2008

SQL Server 2008 introduces a new data type, HIERARCHYID, to store hierarchical data in a database table. HIERARCHYID is a variable-length system data type used to locate the position of an element in a hierarchy. It is optimized for representing trees, which are the most common type of hierarchical data. The HIERARCHYID data type should be used to represent a position in a hierarchy; that is, a column of type HIERARCHYID does not represent a tree itself, but rather the position of a row/node within a defined tree. The HIERARCHYID data type exposes many methods which can be used to retrieve a list of ancestors and descendants, traverse the tree, and so on. For more details click here.
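As a small sketch (table and node names are hypothetical), inserting a tiny tree and reading back readable paths might look like this:

```sql
CREATE TABLE dbo.Org
(
    Node HIERARCHYID PRIMARY KEY,   -- position of the row within the tree
    Name NVARCHAR(50) NOT NULL
);

DECLARE @root HIERARCHYID = hierarchyid::GetRoot();
INSERT INTO dbo.Org VALUES (@root, N'CEO');
INSERT INTO dbo.Org VALUES (@root.GetDescendant(NULL, NULL), N'CTO');  -- first child: /1/
INSERT INTO dbo.Org
VALUES (@root.GetDescendant(@root.GetDescendant(NULL, NULL), NULL), N'CFO'); -- second child: /2/

-- Readable paths and levels
SELECT Node.ToString() AS Path, Node.GetLevel() AS Level, Name
FROM dbo.Org
ORDER BY Node;
```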

Tuesday, October 27, 2009

New Date and Time Data Types in SQL Server 2008

SQL Server 2008 introduces four new date and time data types (DATE, TIME, DATETIME2 and DATETIMEOFFSET) which are more optimized for their intended usage and memory requirements. DATETIME2, in particular, is more SQL-standard compliant and aligns better with the .NET DateTime type. To learn more about them and how they work click here.
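A quick sketch of declaring and using the new types:

```sql
DECLARE @d   DATE           = '2009-10-27';                  -- date only, 3 bytes
DECLARE @t   TIME(7)        = '13:45:30.1234567';            -- time only, up to 100 ns precision
DECLARE @dt2 DATETIME2(7)   = '2009-10-27 13:45:30.1234567'; -- wider range/precision than DATETIME
DECLARE @dto DATETIMEOFFSET = '2009-10-27 13:45:30.123 -08:00'; -- time-zone aware

SELECT @d AS d, @t AS t, @dt2 AS dt2,
       SWITCHOFFSET(@dto, '+00:00') AS utc;  -- same instant expressed in UTC
```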

User-Defined Table Type and Table Valued Parameter (TVP) in SQL Server 2008

With SQL Server 2008, you can create a user-defined table type which represents the definition of a table structure. To ensure that the data in a user-defined table type meets specific requirements, you can also create unique constraints and primary keys on this type. Further, to send multiple rows of data to a stored procedure or a function without creating a temporary table or many parameters, you can use a user-defined table type to declare table-valued parameters for stored procedures or functions.
Table-valued parameters offer more flexibility and, in some cases, better performance than temporary tables or other ways to pass a list of parameters. Among their benefits: they do not acquire locks for the initial population of data from a client, they do not cause a statement to recompile, they reduce round trips to the server, and they enable the client to specify sort order and unique keys. To learn more about these exciting new features and how to use them from a .NET application, click here.
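A minimal sketch of the whole flow (the type, procedure and column names are hypothetical):

```sql
-- 1. Define a table type
CREATE TYPE dbo.OrderLineType AS TABLE
(
    ProductID INT PRIMARY KEY,   -- primary keys/unique constraints are allowed
    Quantity  INT NOT NULL
);
GO

-- 2. A procedure that accepts it (TVPs must be declared READONLY)
CREATE PROCEDURE dbo.AddOrderLines
    @Lines dbo.OrderLineType READONLY
AS
    SELECT ProductID, Quantity FROM @Lines;
GO

-- 3. Fill a variable of the type and pass multiple rows in one call
DECLARE @l dbo.OrderLineType;
INSERT INTO @l VALUES (1, 10), (2, 5);
EXEC dbo.AddOrderLines @Lines = @l;
```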

Tuesday, October 6, 2009

Backup and Restore SQL Server databases programmatically with SMO

In this article I provide examples for SQL Server database administrators on how to back up and restore SQL Server databases programmatically with SMO. I will start with how you can issue different types of backups (Full, Differential and Log) with SMO, and then how to restore them programmatically when required. Click here for more details....

Generate SQL Scripts for database objects with SMO

In this article I talk about how you can generate SQL object scripts programmatically. Though you can do this through SQL Server Management Studio (SSMS), there might be times (more details on usage scenarios are given below) when you would need to create SQL scripts automatically. Click here for more details....

Accessing SQL Server programmatically with SQL Server Management Objects (SMO)

SQL Server 2005 and 2008 provide SQL Server Management Objects (SMO), a collection of namespaces which in turn contain different classes, interfaces, delegates and enumerations, to programmatically work with and manage a SQL Server instance. SMO extends and supersedes SQL Server Distributed Management Objects (SQL-DMO) which was used for SQL Server 2000. In this article, I discuss how you can get started with SMO and how you can programmatically manage a SQL Server instance with your choice of programming language. Click here for more details....

Change Tracking in SQL Server 2008

Change Tracking is a lightweight feature which provides a synchronization mechanism between two applications. In other words, it tracks the net DML (INSERT, UPDATE and DELETE) changes that occur on a table, so that an application (like a caching application) can refresh itself with just the changed dataset. In this article, I discuss in detail what Change Tracking is, how it works, how to configure it, an application scenario, and how it differs from Change Data Capture. Click here for more details....
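As a rough sketch of the configuration and query side (the database and table names are hypothetical), it might look like this:

```sql
-- Enable change tracking at the database and table level
ALTER DATABASE SalesDB SET CHANGE_TRACKING = ON
    (CHANGE_RETENTION = 2 DAYS, AUTO_CLEANUP = ON);

ALTER TABLE dbo.Customers ENABLE CHANGE_TRACKING
    WITH (TRACK_COLUMNS_UPDATED = ON);

-- Later, ask for everything that changed since a saved sync version
DECLARE @last_sync BIGINT = 0;  -- in practice, persisted by the caching application
SELECT ct.CustomerID, ct.SYS_CHANGE_OPERATION   -- I, U or D
FROM CHANGETABLE(CHANGES dbo.Customers, @last_sync) AS ct;
```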

Reorganize and Rebuild Index in SQL Server 2005 and 2008

Once you have identified a high fragmentation level in your database, which could be a bottleneck for your SQL Server performance, what is the next step to fix it? In this article, I discuss the different methods of fixing identified high fragmentation levels with Reorganize and Rebuild, and the feasibility of each. Click here for more details....
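The two options boil down to two forms of ALTER INDEX; a quick sketch (the index and table names are hypothetical):

```sql
-- Reorganize: lightweight, always online, suited to moderate fragmentation
ALTER INDEX IX_Orders_CustomerID ON dbo.Orders REORGANIZE;

-- Rebuild: recreates the index, suited to heavy fragmentation
-- (ONLINE = ON requires Enterprise Edition)
ALTER INDEX IX_Orders_CustomerID ON dbo.Orders REBUILD WITH (ONLINE = ON);
```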

Wednesday, September 30, 2009

Filtered Indexes in SQL Server 2008

SQL Server 2008 introduces Filtered Indexes: an index with a WHERE clause. Doesn't it sound awesome, especially for a table that has a huge amount of data from which you often select only a subset? For example, you have a lot of NULL values in a column and you want to retrieve records with only non-NULL values, or, in another scenario, you have several categories of data in a particular column but you often retrieve data only for a particular category value. This article talks about what a Filtered Index is, how it differs from other indexes, its usage scenarios, its benefits and its limitations.
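A minimal sketch of the non-NULL scenario (the table, column and index names are hypothetical):

```sql
-- Index only rows where the column actually has a value; queries that
-- filter on non-NULL SSNs can use this much smaller index.
CREATE NONCLUSTERED INDEX IX_Employees_SSN
ON dbo.Employees (SSN)
WHERE SSN IS NOT NULL;
```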

Executing dynamic SQL scripts on remote SQL Server with EXEC AT statement

With SQL Server 2000, we had OPENQUERY and OPENROWSET to execute a pass-through query on a specified server, but these have several inherent limitations. Starting with SQL Server 2005 we have another, more elegant way, "EXEC AT", to execute a pass-through query on a specified linked server, which also addresses several shortcomings of the OPENQUERY and OPENROWSET table functions. This article talks about this new way of executing dynamic SQL scripts on a remote server in SQL Server 2005 and 2008.
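A quick sketch of the syntax (the linked server name LinkedSrv is hypothetical and must be configured beforehand):

```sql
-- Pass-through query with a parameter placeholder (?)
EXEC ('SELECT name FROM sys.databases WHERE database_id = ?', 1) AT LinkedSrv;
```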

Tuesday, August 25, 2009

Sending HTML formatted email from SSIS

The Send Mail Task is quite simple to use and fits scenarios where you need to send plain-text email with little development effort. To overcome the limitations imposed by the Send Mail Task, you can use the Script Task instead. Click here to see what code you need to write to send HTML formatted mails from an SSIS package.

Sending email from SSIS Package

SSIS provides a built-in "Send Mail Task" to send email. The Send Mail Task is quite simple and straight forward in its configuration and use. Click here to learn how to configure and use Send Mail Task in your SSIS Package.

Resource Governor in SQL Server 2008

Resource Governor is a new technology in SQL Server 2008 that enables you to manage SQL Server workloads and resources by specifying limits on resource consumption by incoming requests. In an environment where multiple distinct workloads are present on the same server, Resource Governor enables you to differentiate these workloads and allocate shared resources as they are requested, based on the limits that you specify. These resources are CPU and memory. Click here for more details on Resource Governor.
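A rough sketch of the moving parts, with hypothetical pool, group, and login names (the classifier function must be created in the master database):

```sql
-- Cap a hypothetical reporting workload at 30% CPU
CREATE RESOURCE POOL ReportPool WITH (MAX_CPU_PERCENT = 30);
CREATE WORKLOAD GROUP ReportGroup USING ReportPool;
GO

-- Classifier function: route sessions from login 'report_user' to ReportGroup
CREATE FUNCTION dbo.rg_classifier() RETURNS SYSNAME
WITH SCHEMABINDING
AS
BEGIN
    RETURN CASE WHEN SUSER_NAME() = 'report_user'
                THEN N'ReportGroup' ELSE N'default' END;
END;
GO

ALTER RESOURCE GOVERNOR WITH (CLASSIFIER_FUNCTION = dbo.rg_classifier);
ALTER RESOURCE GOVERNOR RECONFIGURE;
```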

Sunday, March 22, 2009

Identifying fragmentation level in SQL Server 2005 and 2008

While indexes can speed up execution of queries several fold, there is overhead associated with them. They consume additional disk space and require additional time to update themselves whenever data is updated, deleted or appended in a table. Also, when you perform any data modification operations (INSERT, UPDATE or DELETE statements), index fragmentation may occur and the information in the index can get scattered across the database. Fragmented index data can cause SQL Server to perform unnecessary data reads and switch across different pages, so query performance against a heavily fragmented table can be very poor.
Refer to this link to learn more about fragmentation and the different queries to determine the level of fragmentation.

http://www.mssqltips.com/tip.asp?tip=1708
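As one example of such a query, the sys.dm_db_index_physical_stats DMV reports fragmentation per index (this is a sketch; the linked tip covers more variations):

```sql
-- Average fragmentation per index in the current database
SELECT OBJECT_NAME(ips.object_id)  AS TableName,
       i.name                      AS IndexName,
       ips.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN sys.indexes AS i
  ON i.object_id = ips.object_id AND i.index_id = ips.index_id
ORDER BY ips.avg_fragmentation_in_percent DESC;
```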

MERGE SQL Statement in SQL Server 2008

In a typical data warehousing application, quite often during the ETL cycle you need to perform INSERT, UPDATE and DELETE operations on a TARGET table by matching records from a SOURCE table. For example, a products dimension table has information about the products; you need to sync up this table with the latest product information from the source table. You would need to write separate INSERT, UPDATE and DELETE statements to refresh the target table with an updated product list, or do lookups. Though it seems straightforward at first glance, it becomes cumbersome when you have to do it very often or on multiple tables, and performance degrades significantly with this approach. With the new MERGE SQL statement in SQL Server 2008 you can perform all these operations in one pass.
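A minimal sketch of such a one-pass sync (the table and column names are hypothetical):

```sql
-- Sync the products dimension (target) with the latest source data in one pass
MERGE dbo.DimProduct AS tgt
USING dbo.StageProduct AS src
    ON tgt.ProductID = src.ProductID
WHEN MATCHED AND tgt.Name <> src.Name THEN
    UPDATE SET tgt.Name = src.Name
WHEN NOT MATCHED BY TARGET THEN
    INSERT (ProductID, Name) VALUES (src.ProductID, src.Name)
WHEN NOT MATCHED BY SOURCE THEN
    DELETE;   -- MERGE must be terminated with a semicolon
```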
Refer to this link to learn more about it.

Debugging T-SQL in SQL Server 2008 SSMS

If you recall your days working with SQL Server 2000, you will remember debugging a routine (stored procedure, UDF or trigger) in Query Analyzer, as a debugger tool was available with it. Starting with SQL Server 2005, Query Analyzer and Enterprise Manager were merged into SQL Server Management Studio (SSMS). Though this single interface simplified working with SQL Server, one major drawback was that it did not allow you to debug a routine. For that purpose you needed a Visual Studio (Enterprise or Professional) edition installed on your development machine. The requirement to install Visual Studio is something that database developers and DBAs would be reluctant to meet, as it requires additional funds for a Visual Studio license and puts additional pressure on the physical box after installation. Thankfully the Microsoft SQL Server team decided to provide this feature in SQL Server 2008 SSMS.

Refer to this link to learn more about it.

Migration Strategies for SQL Server 2008

SQL Server 2008 delivers a powerful set of capabilities to solve the growing needs of managing data in the enterprise, on desktops, and on mobile devices. It also builds on the strong momentum in the business intelligence market by providing a scalable infrastructure that enables information technology to drive business intelligence throughout the organization and deliver intelligence where users want it. SQL Server 2008 also delivers improved performance in many areas, including data warehousing, reporting, and analytics. So if you make the decision to upgrade to 2008, there are a number of tools that make the process easier, but you still need to understand what things you should consider.
Refer to this link to learn more about it.

Tuesday, March 17, 2009

VSTA support for Script Task and Script Component in SSIS 2008

Unlike SQL Server 2005, SQL Server 2008 provides the VSTA (Visual Studio Tools for Applications) environment for writing Script Tasks and Script Components, instead of the VSA (Visual Studio for Applications) environment. VSTA includes all the standard features of the Visual Studio environment, such as the color-coded Visual Studio editor, IntelliSense and Object Browser, and debugging features like breakpoints, watch/auto/locals windows and many more.

Refer to this link to learn more about it.

http://www.sql-server-performance.com/articles/biz/SSIS_New_Features_in_SQL_Server_2008_Part5_p1.aspx

Pipeline Performance Improvements in SSIS 2008


In SQL Server 2008 SSIS, the data flow task has been redesigned to do dynamic scheduling and can now execute multiple components in parallel, even if they belong to the same execution tree. In other words, several threads can work together to do the work that a single thread is forced to do by itself in SQL Server 2005 SSIS. This can give you a several-fold speedup in ETL performance.
Refer to this link to learn more about it.

Data Profiling task in SSIS 2008


In SQL Server 2008, SSIS introduces the Data Profiling task in its toolbox, which provides data profiling functionality inside the process of extracting, transforming, and loading data. By using the Data Profiling task, you can analyze the source data more effectively, understand the source data better, and prevent data quality problems before they are introduced into the data warehouse.

Refer to this link for more details.

Friday, March 13, 2009

Lookup Transformations in SSIS 2008

The Lookup transformation in SSIS 2008 has been improved to allow explicit control over the lookup data. A new breed of connection manager, the Cache Connection Manager, has been introduced to persist the cache to a file and share cached lookup data among different components and packages, and finally there is a more intuitive UI for designing the Lookup transformation.
Refer to this link for more details.

Lookup Transformation in SSIS 2005 and 2008

The Lookup transformation performs lookups by joining data in input columns with columns in a reference dataset/table. If there is no matching entry in the reference dataset, no join occurs and no values are returned from the reference dataset. This is an error, and the transformation fails, unless it is configured to ignore errors or redirect error rows to the error output. If there are multiple matches in the reference table, the lookup returns only the first match based on the lookup query.
I have written articles covering the Lookup transformation in SSIS 2005 and the Lookup transformation in SSIS 2008 in detail, including the different caching mechanisms and detailed usage examples.

Refer to this link for more details.

Sunday, March 8, 2009

SSIS Parallel Processing

Parallel execution improves performance on computers that have multiple physical or logical processors. To support parallel execution of different tasks in the package, SSIS uses two properties: MaxConcurrentExecutables and EngineThreads.
In my next article on SSIS parallel processing, I will cover how you can utilize the parallel processing capabilities of SSIS. In the later part of the article, I will provide some tips for SSIS performance optimization, and finally I will talk about what you need to take care of when executing your SSIS package on 64-bit computers.
Refer to this link for more details.

SSIS Buffer Management

The data flow engine requires buffers to store incoming data from the source, perform any necessary transformations in memory, and load the data into the destination. The creation, allocation and management of buffers are handled by the SSIS Buffer Manager.

I have written an article covering all aspects of SSIS buffer management, including how buffers are allocated and de-allocated for transformations, the different kinds of buffer-related performance counters, etc.

Refer to this link for more details.
http://www.sql-server-performance.com/articles/biz/SSIS_An_Inside_View_Part_3_p1.aspx

SSIS Transformation and Execution Tree

There are two main concepts related to SSIS internals which need to be understood before we dive deep into optimizing SSIS packages.
Transformation - There are different kinds of transformations in SSIS, which have an overall impact on the performance of SSIS.
Execution Tree - At run time, the data flow engine breaks down Data Flow task operations into execution trees. Execution trees are enormously valuable in understanding buffer usage.

I have written an article covering different kinds of transformation and how data flow task operations are divided into execution trees.

Refer to this link for more details.

http://www.sql-server-performance.com/articles/biz/SSIS_An_Inside_View_Part_2_p1.aspx

SSIS Architecture

SSIS is a component of SQL Server 2005/2008 and the successor of DTS (Data Transformation Services), which shipped with SQL Server 7.0/2000. Though from an end-user perspective DTS and SSIS look similar to some extent, that is not actually the case. SSIS has been completely written from scratch (it's a new enterprise ETL product altogether) and hence overcomes several limitations of DTS.

I have written an article covering the SSIS architecture and how it is different from DTS. Understanding these things will let you understand the internals of SSIS, or how SSIS actually works.

Refer to this link for more details.
http://www.sql-server-performance.com/articles/biz/SSIS_An_Inside_View_Part_1_p1.aspx

Tuesday, January 27, 2009

SQL Server Integration Services (SSIS) - Validation And DelayValidation

Validation and DelayValidation are among the most confusing parts of SSIS and hence widely misunderstood by SSIS developers. Yet understanding these concepts is essential to understanding what happens before the runtime or pipeline engine starts executing the package.
I have written an article covering all aspects of the SSIS Validation and DelayValidation processes, starting with what validation is, why it is required at all, the different kinds of validation done by the runtime engine and the pipeline engine, and finally explaining all these concepts with examples.
Refer to this link for more details.

SQL Server Integration Services (SSIS) - Checkpoint Restart-ability

ETL operations are normally very complex and time-consuming processes, as they often deal with millions of records while pulling data or transforming it. If a package fails in the middle of its execution, on the next execution it will start again from the beginning, repeating all the tasks completed in the last run.
SSIS provides Checkpoint restart functionality which simplifies the recoverability of packages that contain complex operations and can provide significant time savings, because the package does not need to reprocess all of the tasks prior to the checkpoint and will start from the last point of failure.
Refer to this link for more details.

SQL Server Integration Services (SSIS) - Transaction Support

By default, every component in an SSIS package executes in its own transaction. Using the Transaction Support feature of SSIS you can group two or more components together and let all the components inside that group execute in a single transaction.
I have written an article covering all aspects of SSIS Transaction Support functionality, starting with how to enable a transaction on a group of components, the different options available, the execution behavior under those options, examples of Transaction Support, its configuration, and finally best practices to follow while using this feature.
Refer to this link for more details.

SQL Server Integration Services (SSIS) - Event Handlers

Like any other event-driven programming environment, an SSIS package and its components generate events during their execution life-cycle. You can extend package functionality by writing event handlers for these events, and can make package management easier during runtime.
I have written an article covering all aspects of SSIS Event Handler functionality, starting with what an Event Handler is, examples of some events, its usage scenarios, its configuration, and finally best practices to follow while using this feature.
Refer to this link for more details.

Sunday, January 25, 2009

SQL Server Integration Services (SSIS) - Event Logging

While troubleshooting your SSIS package or tuning it for performance, you need to know what the SSIS Runtime Engine and SSIS Pipeline Engine are doing under the hood. For that purpose, SSIS provides an Event Logging feature which traces the execution of an SSIS package and its components during its execution life-cycle. I have written an article covering all aspects of SSIS Event Logging, starting with what Event Logging is, its usage scenarios, its configuration, and finally best practices to follow while using this feature.
Refer to this link for more details.

Wednesday, January 21, 2009

SQL Server 2008 : T-SQL Enhancements

SQL Server 2008 comes with several compelling new features. One of them is the set of T-SQL enhancements, which increase the productivity of developers by reducing overall development time and also improve the performance of SQL Server.

I have written an article covering a few of the T-SQL enhancements, for example:
· Intellisense Enhancements
· Syntax Enhancements
· Object Dependencies Enhancements
· Using the FORCESEEK Table Hint
· GROUPING SETS

Refer to this link for more details.
http://www.sqlservercentral.com/articles/SQL+Server+2008/65539/
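As a small illustration of one of these enhancements, GROUPING SETS computes several aggregation levels in a single query, equivalent to a UNION ALL of separate GROUP BY queries (the table below is hypothetical):

```sql
SELECT Region, Product, SUM(Amount) AS Total
FROM dbo.Sales
GROUP BY GROUPING SETS
(
    (Region, Product),  -- per region and product
    (Region),           -- per region subtotal
    ()                  -- grand total
);
```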

In the next article, I will be writing about new data-types introduced in SQL Server 2008.

Wednesday, January 7, 2009

Introduction of SQL Server Integration Services - A Tutorial article

SQL Server Integration Services (SSIS) is a platform for building high-performance data integration and workflow solutions. It allows you to create SSIS packages, which are made up of tasks that can move data from source to destination and, if necessary, alter it on the way. It can also be used for several other purposes, for example to automate maintenance of SQL Server databases, update multidimensional cube data, and so on.

In this four-part article series, I discuss SSIS at an introductory level in detail.

Part 1 - In this part, the discussion is about SSIS itself: its origin and brief history, the SSIS package and its components, and the different ways to create SSIS packages. Refer to the link below for more details.
http://www.sql-server-performance.com/articles/biz/SSIS_Introduction_Part1_p1.aspx

Part 2 – In this part, the discussion is about the Data Flow Task in SSIS and the different components which make up the data flow task, for example transformations, data source adapters, data destination adapters and data paths. Refer to the link below for more details.
http://www.sql-server-performance.com/articles/biz/SSIS_Introduction_Part2_p1.aspx

Part 3 – In this part, the discussion is about using the Import and Export Wizard and the SSIS Designer. Finally, we go through some examples of creating SSIS packages. Refer to the link below for more details.
http://www.sql-server-performance.com/articles/biz/SSIS_Introduction_Part3_p1.aspx

Part 4 – In this part, the discussion is about SSIS API programming. We will create and execute a simple package programmatically using the SSIS API object model.
http://www.sql-server-performance.com/articles/biz/SSIS_Introduction_Part4_p1.aspx

In the next article, I am going to write more about the features and properties of SSIS, for example Event Logging, Event Handlers, Transaction Support, Checkpoint Restart-ability and the SSIS validation process. So stay tuned to see the power and capabilities of SSIS.