Wednesday, June 25, 2014

Windows Azure SQL Database Backup and Restore strategy

Microsoft offers Windows Azure SQL Database as a cloud-based SQL Server service offering. Though it has built-in features to protect data from server, network and device failures (reducing the overhead of setting up and ensuring high availability), in certain cases you still need to protect your data from application or user errors. In this tip we cover how to set up a backup and restore strategy for a Windows Azure SQL Database. For more information click here.
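
One approach commonly used with Windows Azure SQL Database is a transactionally consistent database copy, which can serve as a point-in-time snapshot before a risky change. A minimal T-SQL sketch, assuming you run it against the destination server's master database (server and database names are placeholders):

```sql
-- Create a transactionally consistent copy of an Azure SQL Database
-- (myserver and AdventureWorks are placeholder names).
CREATE DATABASE AdventureWorks_Copy
    AS COPY OF myserver.AdventureWorks;

-- Monitor the progress of the copy operation
SELECT name, state_desc
FROM sys.databases
WHERE name = 'AdventureWorks_Copy';
```

The copy is asynchronous; the new database is usable once its state reaches ONLINE.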

Creating Power View Reports in Excel 2013

In my last tip, I talked about Power View, a new tool for intuitive ad hoc reporting which provides an interactive data exploration, visualization, and presentation experience to business users. I demonstrated how you can configure and create Power View reports in SharePoint. In this tip we cover how to use Power View in Excel 2013. For more information click here.

Getting started with Power View Reports with SharePoint, Excel and SQL Server

The Microsoft Business Intelligence stack provides several tools for analyzing data from different perspectives, in different formats and in different ways. Power View is a new addition to this arsenal which provides intuitive ad hoc reporting capabilities for business users, delivering an interactive data exploration, visualization, and presentation experience. But how does it work and how do you get started with it? For more information click here.

Getting started with the AMR tool for migration to SQL Server In-memory OLTP Tables

SQL Server 2014 introduces In-Memory OLTP to create memory optimized tables for OLTP workloads to significantly improve performance and reduce processing time when you have a significant amount of memory and multi-core processors. But how do you decide which tables or stored procedures might benefit from migration to In-Memory OLTP, and how much work is required to migrate those objects? For more information click here.

SQL Server 2014 In Memory OLTP Durability, Natively Compiled Stored Procedures and Transaction Isolation Level

SQL Server 2014 introduces the In-Memory OLTP feature for OLTP workloads to significantly improve performance and reduce processing time when you have a plentiful amount of memory and multi-core processors. In-Memory OLTP allows you to create a memory optimized table which resides completely in memory and can remain resilient if the server crashes. But what about the data durability of memory optimized tables? I heard there is a new type of stored procedure which improves performance even further when working with memory optimized tables? What about transaction and isolation level support for these types of tables? For more information click here.
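
To make the durability and natively compiled stored procedure concepts concrete, here is a minimal T-SQL sketch (table, column and procedure names are illustrative placeholders). DURABILITY = SCHEMA_AND_DATA makes both schema and data crash-resilient, while SCHEMA_ONLY would persist only the table definition:

```sql
-- Memory optimized table with full durability
CREATE TABLE dbo.SalesOrder
(
    OrderID   INT NOT NULL
        PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1000000),
    OrderDate DATETIME2 NOT NULL
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);
GO

-- Natively compiled stored procedure accessing the table;
-- the atomic block declares the required isolation level up front
CREATE PROCEDURE dbo.InsertSalesOrder
    @OrderID INT, @OrderDate DATETIME2
WITH NATIVE_COMPILATION, SCHEMABINDING, EXECUTE AS OWNER
AS
BEGIN ATOMIC WITH (TRANSACTION ISOLATION LEVEL = SNAPSHOT, LANGUAGE = N'us_english')
    INSERT INTO dbo.SalesOrder (OrderID, OrderDate)
    VALUES (@OrderID, @OrderDate);
END;
```

Natively compiled procedures are compiled to machine code at creation time, which is where the additional performance gain over interpreted T-SQL comes from.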

Overview of Applications, Indexes and Limitations for SQL Server 2014 In-Memory OLTP Tables

SQL Server 2014's In-Memory OLTP feature is intended to significantly improve performance and reduce processing time for servers with large amounts of memory and multi-core processors. What types of applications can take advantage of this new feature? What are the different types of indexes which can be created on memory optimized tables? Are there any considerations, restrictions or limitations? For information click here.

Getting started with Indexes on SQL Server Memory Optimized Tables

SQL Server 2014 introduces the In-Memory OLTP engine which is fully integrated in SQL Server and allows you to create memory optimized tables. Data for these memory optimized tables gets stored in memory in a completely different structure than traditional disk based tables. But what about the indexes? What types of indexes can be created on memory optimized tables and when can they be used? How does this work? For more information click here.
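
As a quick sketch of the two index types available on memory optimized tables in SQL Server 2014 (table and index names are placeholders), note that all indexes must be declared inline in the CREATE TABLE statement:

```sql
CREATE TABLE dbo.Customer
(
    -- Hash index: best for equality (point) lookups; BUCKET_COUNT
    -- should be sized to roughly the number of distinct key values
    CustomerID INT NOT NULL
        PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 100000),

    LastName   NVARCHAR(50) NOT NULL,

    -- Nonclustered (range) index: supports ordered scans
    -- and range predicates such as BETWEEN or >
    INDEX ix_LastName NONCLUSTERED (LastName)
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);
```

Indexes on memory optimized tables exist only in memory and are rebuilt when the database starts up, so they add no I/O during normal operation.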

Getting started with SQL Server 2014 In-Memory OLTP

In-Memory OLTP is a new feature in SQL Server 2014 for OLTP workloads to significantly improve performance and reduce processing time when you have plenty of memory and multiple multi-core processors. But how does someone get started using this feature? What changes need to be made in the database or what scripts need to be run to create memory optimized tables? Also, what are some of the considerations for migration? For more information click here.
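
The first database change needed before creating any memory optimized table is a memory optimized filegroup with at least one container. A minimal sketch, assuming an existing database (database name and file path are placeholders):

```sql
-- Add a memory optimized filegroup to the database
ALTER DATABASE SalesDB
    ADD FILEGROUP SalesDB_mod CONTAINS MEMORY_OPTIMIZED_DATA;

-- Add a container (a folder path, not a single file) to the filegroup
ALTER DATABASE SalesDB
    ADD FILE (NAME = 'SalesDB_mod1', FILENAME = 'C:\Data\SalesDB_mod1')
    TO FILEGROUP SalesDB_mod;
```

Once the filegroup exists, memory optimized tables can be created with the MEMORY_OPTIMIZED = ON option.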

SQL Server 2014 In-Memory OLTP Architecture and Data Storage

Microsoft is releasing SQL Server 2014 with a new set of In-Memory OLTP features to significantly improve OLTP performance and reduce processing time for servers with a large amount of memory and multi-core processors. But how is this functionality different from the DBCC PINTABLE command? What are the design principles for this feature? How does the data get stored in this new type of table? How is In-Memory OLTP different from the existing SQL Server disk based tables? How does it all work? For more information click here.

Backup and Restore with Cloud services (Windows Azure Blob Storage Service) using PowerShell

The two main PowerShell cmdlets introduced in SQL Server 2012 for backup and restore operations are Backup-SqlDatabase and Restore-SqlDatabase. These cmdlets have been further enhanced and new cmdlets have been introduced in SQL Server 2012 SP1 CU4 to allow backing up your database to Windows Azure Blob storage and restoring from there as needed. In this tip we look at how to write PowerShell scripts for backing up and restoring a database to Windows Azure Blob storage. For more information click here.
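
A hedged sketch of what those cmdlet calls can look like, assuming SQL Server 2012 SP1 CU4 or later and an existing SQL Server credential holding the storage account key (server, database, credential and storage names are placeholders):

```powershell
# Back up a database directly to Windows Azure Blob storage
Backup-SqlDatabase -ServerInstance "MYSERVER" -Database "AdventureWorks2012" `
    -BackupFile "https://mystorageaccount.blob.core.windows.net/backups/AdventureWorks2012.bak" `
    -SqlCredential "myAzureCredential"

# Restore the same database from the blob, replacing the existing copy
Restore-SqlDatabase -ServerInstance "MYSERVER" -Database "AdventureWorks2012" `
    -BackupFile "https://mystorageaccount.blob.core.windows.net/backups/AdventureWorks2012.bak" `
    -SqlCredential "myAzureCredential" -ReplaceDatabase
```

The -SqlCredential parameter names a SQL Server credential object, not a Windows login; the credential stores the storage account name and access key.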

Backup and Restore SQL Server databases with Azure cloud services and SMO

As DBAs, we not only have the responsibility to back up databases on a regular basis, but also to ensure that backup drives and tapes are secure so databases can be restored when needed. I heard that SQL Server 2012 supports backups and restores using a Windows Azure Blob Storage account. How does this work and how do I get started using SMO? For more information click here.
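
A minimal SMO sketch of a backup to Windows Azure Blob storage from PowerShell, assuming the SMO assemblies are installed and a SQL Server credential already holds the storage key (all names and the URL are placeholders):

```powershell
# Load SMO and connect to the instance
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | Out-Null
$server = New-Object Microsoft.SqlServer.Management.Smo.Server("MYSERVER")

# Configure a database backup that targets a blob URL
$backup = New-Object Microsoft.SqlServer.Management.Smo.Backup
$backup.Action = [Microsoft.SqlServer.Management.Smo.BackupActionType]::Database
$backup.Database = "AdventureWorks2012"
$backup.CredentialName = "myAzureCredential"   # SQL Server credential with the storage key
$backup.Devices.AddDevice(
    "https://mystorageaccount.blob.core.windows.net/backups/AdventureWorks2012.bak",
    [Microsoft.SqlServer.Management.Smo.DeviceType]::Url)

# Run the backup
$backup.SqlBackup($server)
```

The key difference from a disk backup is the device type Url plus the CredentialName property; the rest of the SMO Backup workflow is unchanged.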

SQL Server Database Backup and Restore with Windows Azure Blob Storage Cloud Services using T-SQL Commands

SQL Server 2012 introduces a new feature to write backups directly to the Windows Azure Blob Storage service and restore from there when needed. But how does it work and how do you get started using T-SQL commands for backup and restore operations? For more information click here.
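
The T-SQL side comes down to a credential plus the TO URL / FROM URL syntax. A minimal sketch (storage account, container, key and database names are placeholders; the secret is deliberately left as a placeholder):

```sql
-- Credential holding the storage account name and access key
CREATE CREDENTIAL myAzureCredential
    WITH IDENTITY = 'mystorageaccount',
    SECRET = '<storage access key>';

-- Back up directly to Windows Azure Blob storage
BACKUP DATABASE AdventureWorks2012
    TO URL = 'https://mystorageaccount.blob.core.windows.net/backups/AdventureWorks2012.bak'
    WITH CREDENTIAL = 'myAzureCredential', COMPRESSION, STATS = 10;

-- Restore from the same blob, replacing the local copy
RESTORE DATABASE AdventureWorks2012
    FROM URL = 'https://mystorageaccount.blob.core.windows.net/backups/AdventureWorks2012.bak'
    WITH CREDENTIAL = 'myAzureCredential', REPLACE;
```

COMPRESSION is worth considering here since it also reduces the amount of data transferred over the network to the storage account.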

Backup and Restore with Cloud services (Windows Azure Blob Storage Service)

As DBAs, we have the responsibility to back up databases on a regular basis, but we also have to ensure that backups are stored securely and safely so databases can be restored when needed. With traditional methods it is sometimes difficult to ensure that these backup files are readily accessible when needed. I heard that SQL Server 2012 now supports backups directly to Windows Azure Blob Storage and that restores can also be done directly from Windows Azure Blob Storage. Can you tell me more about this new feature and the advantages of using it over traditional backup options? For more information click here.

Getting started with SQL Data Sync

Windows Azure SQL Database has a feature called SQL Data Sync which synchronizes data across multiple SQL Server and Windows Azure SQL Database instances. SQL Data Sync is a cloud-based data synchronization service built on Microsoft Sync Framework technologies. It provides one-directional as well as bi-directional data synchronization and data management capabilities, allowing data to be easily shared across Windows Azure SQL Databases in multiple data centers or between on-premises SQL Server databases and Windows Azure SQL Databases. For more information refer to the articles below:


Column Level Encryption in SQL Server

In my last article, Transparent Data Encryption (TDE) in SQL Server, I talked about enabling encryption at the entire database level. In this article I am going to discuss and demonstrate encryption at a more granular level, for each individual cell or column, and how it differs from Transparent Data Encryption (TDE). For more information click here.
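
Cell/column-level encryption in SQL Server follows a key hierarchy: a database master key protects a certificate, which protects a symmetric key used to encrypt values. A minimal sketch (table, column, key and password names are illustrative placeholders; CardNumberEnc is assumed to be a VARBINARY column):

```sql
-- Key hierarchy: master key -> certificate -> symmetric key
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'StrongP@ssw0rd!';
CREATE CERTIFICATE CardCert WITH SUBJECT = 'Credit card encryption';
CREATE SYMMETRIC KEY CardKey WITH ALGORITHM = AES_256
    ENCRYPTION BY CERTIFICATE CardCert;

-- The key must be open in the session before use
OPEN SYMMETRIC KEY CardKey DECRYPTION BY CERTIFICATE CardCert;

-- Encrypt an individual column's values
UPDATE dbo.Customers
SET CardNumberEnc = EncryptByKey(Key_GUID('CardKey'), CardNumber);

-- Decrypt on read
SELECT CONVERT(VARCHAR(25), DecryptByKey(CardNumberEnc)) AS CardNumber
FROM dbo.Customers;

CLOSE SYMMETRIC KEY CardKey;
```

Unlike TDE, this requires application or query changes (EncryptByKey/DecryptByKey calls), but it protects the data even from users who can read the table directly.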

Transparent Data Encryption (TDE) in SQL Server

Encryption brings data into a state that cannot be interpreted by anyone who does not have access to the decryption key, password, or certificate. Though encryption does not restrict access to the data, it ensures that if data loss happens, the data is useless to anyone who does not have access to the decryption key, password, or certificate. When you use encryption, though, you also need a maintenance strategy for passwords, keys, and certificates.
To meet the demands of regulatory compliance and corporate data security standards, SQL Server allows you to enable encryption at the column/cell level or at the database level. You can even use the file level encryption provided by Windows for database files.
Though there are several ways and levels to implement encryption in SQL Server, I am going to focus on Transparent Data Encryption (TDE) in this article, which was introduced in SQL Server 2008 and is available in later releases. For more information click here.
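
As a quick sketch, enabling TDE takes a master key and certificate in the master database, then a database encryption key in the user database (database, certificate and password names are placeholders):

```sql
-- Master key and certificate live in the master database
USE master;
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'StrongP@ssw0rd!';
CREATE CERTIFICATE TDECert WITH SUBJECT = 'TDE certificate';
GO

-- Database encryption key in the user database, protected by the certificate
USE SalesDB;
CREATE DATABASE ENCRYPTION KEY
    WITH ALGORITHM = AES_256
    ENCRYPTION BY SERVER CERTIFICATE TDECert;

-- Turn encryption on; pages are encrypted in the background
ALTER DATABASE SalesDB SET ENCRYPTION ON;
```

Back up the certificate and its private key immediately; without them an encrypted database (or its backups) cannot be restored on another server.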

Getting Started with Microsoft Power BI for Office 365

Microsoft Excel is one of the most widely used tools across the globe. With Power BI, Microsoft empowers Excel users to create Business Intelligence reports quickly in Excel itself. Power BI is a familiar, intuitive and cloud-based self-service BI (Business Intelligence) solution for all your data needs, built into the Excel that users have been working with for decades. It includes different tools for data discovery, analysis and visualization. Furthermore, Power BI integrates with Office 365 for sharing, collaboration and much more. For more details click here.

Managing SQL Server Services with PowerShell

PowerShell provides a command-line shell and scripting language (built on the .NET Framework) especially designed for administrative task automation and configuration management. PowerShell includes commands (called cmdlets) to administer SQL Server, along with cmdlets for administering other Microsoft server products. This gives administrators a common scripting language across server platforms. For more details click here.
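
As a small illustration of the kind of service administration this enables, the standard Get-Service and Restart-Service cmdlets work against SQL Server's Windows services (the service names below are the defaults for an unnamed instance; named instances use names like MSSQL$INSTANCENAME):

```powershell
# Check the state of the SQL Server engine and Agent services
Get-Service -Name 'MSSQLSERVER','SQLSERVERAGENT' |
    Select-Object Name, Status

# Restart the engine; -Force also restarts dependent services
Restart-Service -Name 'MSSQLSERVER' -Force
```

For deeper SQL Server-specific management (instance properties, startup parameters), the SQL Server provider and SMO objects extend this same scripting model.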

Tuesday, June 24, 2014

Getting Started with Change Data Capture (CDC) in SQL Server

Prior to SQL Server 2008 there was no built-in support to identify changed data sets for incrementally pulling data from a source table, and hence we had to write our own custom logic (for example, by adding a Last Created Date or Last Modified Date column to the source table and updating it accordingly, or by some other means) so that changed data sets could be identified for an incremental data pull. Starting with SQL Server 2008 we have two different built-in mechanisms to identify DML changes (INSERT, UPDATE, DELETE) happening at the source table, so that only changed data sets are considered for the data pull from the source table and loaded into the data warehouse: Change Data Capture (CDC) and Change Tracking (CT). Please note, you don't need to write code to leverage these features; you just need to enable them as per your need. Please refer to this article to learn more about Change Data Capture (CDC).
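
Enabling CDC really is just a matter of turning it on, first at the database level and then per table. A minimal sketch (database and table names are placeholders; SQL Server Agent must be running for the capture and cleanup jobs):

```sql
USE SalesDB;

-- Enable CDC for the database (creates the cdc schema and metadata)
EXEC sys.sp_cdc_enable_db;

-- Enable CDC for a specific table; this creates a change table
-- and the capture instance 'dbo_Orders'
EXEC sys.sp_cdc_enable_table
    @source_schema = N'dbo',
    @source_name   = N'Orders',
    @role_name     = NULL;   -- NULL: no gating role for reading changes

-- Read all captured changes for the table over the full LSN range
DECLARE @from_lsn BINARY(10) = sys.fn_cdc_get_min_lsn('dbo_Orders');
DECLARE @to_lsn   BINARY(10) = sys.fn_cdc_get_max_lsn();
SELECT *
FROM cdc.fn_cdc_get_all_changes_dbo_Orders(@from_lsn, @to_lsn, N'all');
```

An incremental pull then simply remembers the last LSN it processed and uses it as the lower bound on the next run.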