1. SQL Server 2005
I. Integration Services Development Enhancements
1) New Integration Services Architecture – from DTS to SSIS, with two distinct engines: the run-time engine and the data flow engine
The
previous architecture of SQL Server Data Transformation Services (DTS) combined
data transformation, tasks, and package control flow into a single component.
This made the creation of complex packages difficult. In SQL Server 2005, the
Integration Services architecture separates data flow from control flow by
introducing two distinct engines—the Integration Services run-time engine and
the Integration Services data flow engine. This separation provides better
control of package execution, increases the visibility of data transformations,
and enhances the extensibility of Integration Services by simplifying the
creation and implementation of custom tasks and transformations.
For
more information, see Integration Services
Architecture.
a) Integration Services Run-Time Engine
The
Integration Services run-time engine stores package layout, executes packages,
controls workflow between tasks, and provides run-time services such as
debugging, logging, event handling, and management of connections, variables,
and transactions.
For
more information, see Control Flow Elements.
b) Integration Services Data Flow Engine
The
Integration Services data flow engine satisfies the needs of enterprises whose
extraction, transformation, and loading (ETL) processes require fast, flexible,
extensible, and dependable data movement. It is optimized for high-performance
data movement and transformation.
The
data flow engine supports multiple sources, multiple transformations, and
multiple destinations in one fast, flexible data flow. Integration Services
includes more than 25 transformations and more than 10 sources and destinations
for use in data flows. The Data Flow task, which represents the Integration
Services data flow engine graphically in SSIS Designer, replaces various DTS
data-oriented tasks, such as the Data Transformation task and the Data Driven
Query task.
2) Extensible Object Model – integrated with .NET framework
In
addition to the separation of data flow and control flow, the entire underlying
Integration Services object model has been reengineered with extensibility in
mind. A powerful integrated development environment (IDE) is combined with
support for the Microsoft .NET Framework to accelerate the creation of powerful
custom Integration Services tasks, transformations, and data adapters.
The
custom extensions that can be developed include tasks, log providers,
enumerators, connection managers, and data flow components. These custom
objects can be integrated into the user interface of Business Intelligence
Development Studio.
The
Integration Services run-time API and data flow API let developers extend and
customize almost every aspect of the object model. New or existing Integration
Services packages can be loaded, modified, and executed programmatically,
giving developers the ability to fully automate package maintenance and
execution.
For
more information, see Integration Services
Programming.
For
some examples of custom Integration Services development projects, see Programming Samples.
3) New Integration Services Designer – New features for design
The new SSIS Designer
provides an integrated developer experience for designing, creating, testing,
and debugging Integration Services packages. The user interface of SSIS
Designer lets you build and configure packages by using drag-and-drop methods
and by selecting options in dialog boxes for each package object.
SSIS Designer includes
these features:
- Separate design surfaces for package control flow, data
flow, and event handlers.
- A hierarchical view of package content in Solution
Explorer.
- Containers that can be expanded and collapsed for
grouping related tasks, allowing easier viewing, organization, and
management of package layout.
- Annotations that make package control flow, data flows,
and event handlers self-documenting.
- A connections area for adding connection managers to
packages and for referencing data source objects.
- Dialog boxes for adding custom variables, configuring
logging, creating configurations, and signing packages with digital
signatures.
- Debugging tools that provide the ability to set
breakpoints on packages, containers, and task events, and data viewers for
watching data as it moves through the data flow.
- A progress window that lists the start time of a
package and its tasks, their execution state, and any warnings and error
messages.
- A graphical representation of package execution including the progress, precedence, and execution outcome of individual tasks, containers, and data flow components.
4) New Development Environment - BIDS
SQL
Server 2005 introduces Business Intelligence Development Studio for building
data transformation solutions, and SQL Server Management Studio for managing
Integration Services packages. Business Intelligence Development Studio hosts
SSIS Designer, the graphical tool for creating Integration Services packages,
and provides all the powerful features of the Microsoft Visual Studio
development environment to the Integration Services package developer.
Business
Intelligence Development Studio integrates seamlessly with the Visual Studio
for Applications (VSA) environment, in which the developer writes scripts for
the Script task and the Script component. When you debug a package that includes both breakpoints on package events (set in SSIS Designer) and breakpoints on lines of script code in a Script task (set in VSA), execution can run to and from breakpoints in both the package and the script.
5) Enhanced SQL Server Import and Export Wizard
The
redesign of the SQL Server Import and Export Wizard puts the focus on copying
data. This wizard is the simplest way to quickly create Integration Services
packages that copy data between two data stores.
The
SQL Server Import and Export Wizard includes many new features, including
better support for data in flat files and real-time preview of data. Saved
packages created by using the SQL Server Import and Export Wizard can be opened
in Business Intelligence Development Studio and extended by using SSIS
Designer.
6) New Workflow Features
Many
new workflow features, including containers for looping, event handlers, and
enhanced precedence constraints, give the package developer more precise
control over package execution.
Workflow
containers provide structure to packages and services to tasks. They support
repeating control flows in packages, and they group tasks and containers into
meaningful units of work. Integration Services provides the following new
containers:
- The Sequence
container, for grouping tasks and other workflow structures into a unit of
work that can be managed as one item. For more information, see Sequence Container.
- The For Loop
container, for grouping tasks and other workflow structures into a unit of
work that is repeated by evaluating an expression. For more information,
see For
Loop Container.
- The Foreach Loop container, for grouping tasks and other
workflow structures into a unit of work that is repeated by enumerating
objects. For more information, see Foreach
Loop Container.
The
Integration Services object model supports a nested hierarchy of containers in
a package control flow. Workflow containers can also include other containers,
providing support for complex package workflow.
7) New Integration Services Tasks
New
and enhanced tasks make it simple to build packages that solve complex business
problems by using only the tasks that Integration Services provides.
a) Workflow Tasks
Integration
Services includes a number of tasks that perform workflow operations, such as
executing other packages, running applications, and sending e-mail messages.
Integration Services also includes these new workflow tasks:
- The WMI Data
Reader task, for querying Windows Management Instrumentation (WMI) data.
- The WMI Event
Watcher task, for listening to WMI events.
b) Data Preparation Tasks
Several
other new tasks provide data preparation functionality, such as uploading,
downloading, and copying files. New data preparation tasks include the
following:
- The File System
task, for performing operations on files and folders in the file system.
- The Web Service
task, for accessing Web services.
- The XML task,
for working with XML documents.
c) Business Intelligence Tasks
Integration
Services now includes a number of tasks that perform business intelligence
operations, such as processing analytic objects, running scripts written in the
Analysis Services data definition language (DDL), and executing data mining
prediction queries. Integration Services includes these new business
intelligence tasks:
- The Analysis
Services Execute DDL task for running DDL scripts.
- The Data Mining
Query task for querying data mining models.
d) Scripting Tasks
Integration
Services includes two tasks that can contain scripts: the Script task, and the
ActiveX Script task for legacy support. These scripts run in the package
workflow to extend package functionality beyond the functionality provided by
the built-in Integration Services tasks.
8) New Integration Services Data Sources and Destinations
In SQL Server 2005,
Integration Services packages can extract data from many additional types of
data sources. To access these data sources, Integration Services provides the
following new sources and destinations in addition to the SQL Server, OLE DB,
and flat file sources and destinations:
- SQL Server 2005 Compact Edition destination, for
inserting and updating data into SQL Server 2005 Compact Edition databases.
- DataReader source and destination, for consuming and providing
data to any .NET Framework data provider.
- XML source, for extracting data from XML documents.
- Raw File source and destination, for reading and
writing raw data to files.
- Recordset destination, for creating and populating an in-memory
ADODB recordset.
- Data Mining Model Training, Data Mining Query,
Partition Processing, and Dimension Processing destinations, for working
with analytic objects such as mining models, cubes, and dimensions.
Integration Services
also includes the Script Component for simplified development of custom sources
and destinations.
9) New Integration Services Data Transformations
The Integration
Services data flow engine supports data flows that have multiple sources,
multiple transformations, and multiple destinations. New transformations make
it easy for developers to build packages that have complex data flow without
writing any code. These transformations include the following:
- Conditional Split and Multicast transformation, for
distributing data rows to multiple downstream data flow components.
- Union All, Merge, and Merge Join transformations, for
combining data rows from multiple upstream data flow components.
- Sort transformation, for sorting data and identifying
duplicate data rows, and the Fuzzy Grouping transformation for identifying
similar data rows.
- Lookup and Fuzzy Lookup transformations, for extending
data with values from a lookup table.
- Term Extraction and Term Lookup transformations, for
text mining applications.
- Aggregate and Slowly Changing Dimension transformations, for summarizing data and for common data warehousing tasks.
- Percentage Sampling and Row Sampling transformations,
for extracting a sample rowset.
- Copy Column, Data Conversion, and Derived Column transformations, for copying and modifying column values.
- Pivot and Unpivot transformations, for creating normalized data rows from non-normalized data and vice versa.
Integration Services
also includes the Script Component for simplified development of custom
transformations.
II. Integration Services Management Enhancements
1) New Package Storage and Management Tools
a) New Integration Services Service
The
Integration Services service is a Microsoft Windows service that manages
package storage and displays a hierarchical view of saved packages in SQL
Server Management Studio. The service supports packages stored in the msdb database
in an instance of SQL Server or in the file system.
For
more information about the new Integration Services service, see Integration
Services Service and Managing
Integration Services Packages.
b) New Package Management Utility
The dtutil command-line
utility manages packages from the command prompt. You can perform tasks such as
copying, deleting, moving, and signing packages that are stored in the msdb database in an instance of SQL Server or in the file system.
2) New Package Monitoring and Troubleshooting Tools
a) New Running Packages List
The
Integration Services service displays a list of running packages in SQL Server
Management Studio.
For
more information, see Integration
Services Service.
b) New Package Logging Options
Integration
Services includes rich logging features, including multiple logging providers,
a logging schema from which you can choose the type of information to log, and
a flexible logging model that supports logging configuration at the package
level and the task level.
For
more information, see Logging
Package Execution.
c) New Package Restart Capability
A
package that fails during execution can now be restarted from a point other
than the very beginning of the package. The package can now include checkpoints
that let you restart the package from the failed task instead of having to
rerun the whole package. By using checkpoints, a package can avoid
unnecessarily repeating these lengthy operations:
- The download
and upload of large files.
- The loading of
large amounts of data, such as a bulk insert.
- The processing
of analytic objects.
When
the failed package is rerun, Integration Services uses a checkpoint file to
determine the location from which to restart the package.
For
more information, see Using
Checkpoints in Packages.
d) New Performance Monitoring Features
Integration
Services now includes a set of performance counters for monitoring the
performance of the data flow engine.
3) New Package Deployment Tools
a) New Package Configuration Features
Configurations
allow you to dynamically update the properties of package objects, and move
packages from development to testing to production environments more easily.
Integration Services includes the Package Configuration Wizard to assist with
configuration management.
For
more information, see Package
Configurations.
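For SQL Server configurations, the Package Configuration Wizard stores property values in a table. The sketch below reflects the wizard's usual defaults, but treat the table name, columns, and sample values as assumptions to verify in your environment:

-- Sketch of the table the Package Configuration Wizard typically generates for
-- SQL Server configurations (names are the wizard defaults; verify in your environment).
CREATE TABLE [dbo].[SSIS Configurations]
(
    ConfigurationFilter  nvarchar(255) NOT NULL,  -- groups related settings
    ConfiguredValue      nvarchar(255) NULL,      -- the value applied at run time
    PackagePath          nvarchar(255) NOT NULL,  -- property path inside the package
    ConfiguredValueType  nvarchar(20)  NOT NULL   -- data type of the configured property
);

-- Example row: point a hypothetical User::ServerName variable at a test server.
INSERT INTO [dbo].[SSIS Configurations]
    (ConfigurationFilter, ConfiguredValue, PackagePath, ConfiguredValueType)
VALUES
    (N'DevSettings', N'TESTSQL01',
     N'\Package.Variables[User::ServerName].Properties[Value]', N'String');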
b) New Package Deployment Utility
Business
Intelligence Development Studio includes the ability to create a deployment
utility for Integration Services packages. During the process of creating the
deployment utility, Business Intelligence Development Studio automatically
detects and includes all package dependencies such as the package
configurations, making it an ideal mechanism for deploying packages in a
production environment. When you run the Integration Services package
deployment utility, you can install packages to the msdb database
in an instance of SQL Server 2005 or to the file system.
For
more information about deployment see Deployment
of Packages.
4) New Package Security Features
a) New Integration Services Roles
Administrators
can manage access to Integration Services packages by using SQL Server roles
for packages stored in the msdb database
in an instance of SQL Server. SQL Server 2005 includes a set of preconfigured
server roles for Integration Services.
For
more information, see Integration
Services Roles. The preconfigured roles and the actions they allow are summarized below; a minimal grant sketch follows the list.
Windows administrators – can stop all currently running packages.
db_dtsadmin (db_ssisadmin in SSIS 2008 and later) or sysadmin – can:
· Import packages.
· Delete own packages.
· Delete all packages.
· Change own package roles.
· Change all package roles.
db_dtsltduser (db_ssisltduser in SSIS 2008 and later) – can:
· Import packages.
· Delete own packages.
· Change own package roles.
db_dtsoperator (db_ssisoperator in SSIS 2008 and later) – can enumerate, view, execute, and export all packages, and execute all packages in SQL Server Agent, but has no write actions.
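As a minimal sketch of how membership in these msdb roles is typically granted (the Windows account below is hypothetical):

-- Give a hypothetical ETL developer limited package permissions by adding the account
-- to db_dtsltduser in msdb (use db_ssisltduser on SSIS 2008 and later).
USE msdb;
CREATE USER [CONTOSO\EtlDeveloper] FOR LOGIN [CONTOSO\EtlDeveloper];
EXEC sp_addrolemember @rolename = N'db_dtsltduser', @membername = N'CONTOSO\EtlDeveloper';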
b) New Package Encryption Options
Integration
Services packages can be encrypted with various levels of encryption to protect
sensitive data.
For
more information, see Setting the Protection Level of Packages.
c) New Digital Signatures for Packages
Integration
Services packages can be digitally signed.
5) Legacy Support Features
SQL Server 2005 includes a set
of tools and features that support managing, editing, running, and migrating
Data Transformation Services (DTS) packages from earlier versions of SQL Server.
SQL Server Management Studio lists available DTS packages and provides options
to modify or execute them. Integration Services also includes the Package
Migration Wizard to assist in migrating DTS packages to the Integration
Services format.
2. SSIS 2008
I. Installation Features
1) New Samples Location
Books
Online no longer includes SQL Server sample databases and sample applications.
These sample databases and sample applications are now available on the SQL Server Samples Web
site. This Web site not only makes it easier for users to find these samples, but also to find additional samples that are related to Microsoft SQL Server and Business Intelligence. On the SQL Server Samples Web site, you can do the following:
·
Browse through samples contributed by developers, users, and the
Microsoft Most Valuable Professional (MVP) community.
·
Download both sample databases and code projects.
·
View or participate in a discussion area where you can report
issues and ask questions about the samples for each technology area.
2) Support for SQL Server 2000 Data Transformation Services (DTS)
Integration
Services still supports SQL Server 2000 Data Transformation Services (DTS). For
more information, see Support for Data
Transformation Services (DTS) in SQL Server 2008.
II. Component Enhancements
1) Enhanced Performance and Caching for the Lookup Transformation
Performance
enhancements to the Lookup transformation include faster cache loading and more
efficient lookup operations. These enhancements are possible because of the
following features:
·
The ability to take rows that do not have matching entries in
the reference dataset and load those rows into the cache.
·
The ability to use separate data flows to load the reference
dataset into the cache and to perform lookups on the reference dataset.
The
Lookup transformation now includes the following caching options:
·
The reference dataset is a cache file (.caw), and is accessed by
using a Cache connection manager.
·
The reference dataset is a connected data source in the data
flow, and is accessed by using a Cache connection manager and a Cache Transform
transformation.
·
The reference dataset is a table, view, or query that is fully
or partially cached and accessed by using an OLE DB connection manager.
·
The cache can be shared between multiple Lookup transformations
in a single package and between transformations in separate packages. For more
information, see How to: Implement a Lookup
Transformation in Full Cache Mode Using the Cache Connection Manager.
·
You can deploy a cache file together with a package. For more
information, see How to: Create and Deploy a
Cache for the Lookup Transformation.
For
more information, see Lookup Transformation, Cache Connection Manager, and Cache Transform.
For a
video that demonstrates how to configure the Lookup Transformation, see the
video home page, How to: Implement a Lookup
Transformation in Full Cache Mode (SQL Server Video), in
the MSDN Library.
2) New ADO.NET Components – new ADO NET Source and ADO NET Destination
Integration
Services now includes the following ADO.NET components:
·
An ADO NET source component that consumes data from a .NET
Framework provider and makes the data available to the data flow. For more
information, see ADO NET Source.
·
An ADO NET destination component that loads data into a variety of ADO.NET-compliant databases by using a database table or view. For more information, see ADO NET Destination.
3) New Data Profiling Task and Data Profile Viewer
The
Data Profiling task is a new task in the Integration Services toolbox. You can
use this task inside an Integration Services package to profile data that
is stored in SQL Server. The information provided by the profile helps you
identify potential problems with data quality. The Data Profiling task provides
profiles that help identify data quality problems within individual columns and
with column relationships:
Profiles that help
identify problems within individual columns
·
The distribution of lengths in the column values.
·
The percentage of null values.
·
The distribution of values in the column.
·
Column statistics for numeric columns.
·
Regular expressions that match string columns.
Profiles that help
identify problems with column relationships
·
Candidate key columns.
·
Functional dependencies between columns.
·
The inclusion of the set of values in one column in the set of
values in another column.
For
more information, see Data Profiling Task and Profiling Data with the
Data Profiling Task and Viewer.
For a
video that demonstrates how to use the Data Profiling task, see the video home
page, How to: Use the Data
Profiling Task (SQL Server Video), in the MSDN Library.
4) New Integration Services Connections Project Wizard – create a package that contains configured connection managers
Use
the Integration Services Connections Project Wizard to create a package that
contains the connection information that you need to connect data sources and
destinations. The wizard guides you through the steps of selecting data
providers, configuring connection managers, and assigning connection managers
to sources and destinations. For more information about when to use this
wizard, see Creating a Package by Running a
Wizard.
5) New Script Environment - Visual Studio Tools for Applications (VSTA) replaces Visual Studio for Applications (VSA)
Business
Intelligence Development Studio now integrates seamlessly with the Microsoft Visual Studio Tools for Applications (VSTA) environment. VSTA is
the development environment in which a developer writes scripts for the Script
task and the Script component.
VSTA
supports both the Microsoft Visual Basic 2008 and Microsoft Visual C# 2008 programming languages.
VSTA also enables you to add managed assemblies to a script at design time by
browsing to the folder location. In addition, VSTA enables you to add a Web
reference in your code that enables the code to use objects and methods
provided by a Web service.
For
SQL Server 2005 Integration Services (SSIS) packages that include
Microsoft Visual Studio for Applications (VSA) scripts, VSTA
converts those scripts.
For
more information about VSTA in Integration Services, see the following topics:
·
Referencing Other
Assemblies in Scripting Solutions
6) Package Upgrade
You
can upgrade your Integration Services packages from the format that Integration
Services uses in SQL Server 2005 to the format that SQL Server 2008 uses. To
upgrade your SQL Server 2005 packages, do one or more of the following
procedures:
·
Use the dtexec command
prompt utility (dtexec.exe) that comes with SQL Server 2008 to run the SQL
Server 2005 package. When you use this method to run a SQL Server 2005
package, the upgrade is temporary, and the changes that result from the upgrade
cannot be saved.
·
Add the SQL Server 2005 package to an existing project or
open that package in SQL Server 2008 Integration Services.
Integration Services will automatically upgrade the package. However, the
upgrade is temporary. To
permanently upgrade the package, you must save the package changes.
·
Create or open a SQL Server 2005 Integration Services
project, and then use the SSIS
Package Upgrade Wizard to upgrade all the packages in the project. This
upgrade process is permanent.
For
more information, see Upgrading Integration
Services Packages.
7) Package Configurations – dtexec applies package configurations twice.
When
you use the dtexec command
prompt utility (dtexec.exe) to run a deployed package, Integration
Services now applies package configurations twice. Integration Services
applies the configuration both before and after applying the options that you
specified at the dtexec command
line.
For
more information, see Package Configurations.
III. Data Management Enhancements
1) Enhanced Data Type Handling in the SQL Server Import and Export Wizard
The
SQL Server Import and Export Wizard now provides additional information and options
related to the data type conversions that the import or export operation requires:
·
You can view data type mapping information for each table or
view that you select to import or export. This information includes a visual
indication of the probability that the conversions will succeed without error.
·
You can view additional detailed information for any column in
the selected table or view.
·
You can accept or reject the data type conversions that the
wizard will perform on a column by column basis.
·
You can specify the handling of errors and truncations on a
global or a column by column basis.
For
more information, see Review Data Type Mapping
(SQL Server Import and Export Wizard).
2) New Date and Time Data Types
The
following new date and time data types are available in Integration Services:
·
DT_DBTIME2
·
DT_DBTIMESTAMP2
·
DT_DBTIMESTAMPOFFSET
These
new Integration Services data types provide the following benefits:
·
Support for a larger scale for fractional seconds.
·
Support for user-defined precision.
·
Support for a time zone offset.
Various
control flow and data flow elements support these new data types. For more
information, see Control Flow Elements and Data Flow Elements.
You
can convert the new data types to other Integration Services date data types by
using expressions, the Data Conversion transformation, and the Derived Column
transformation. You can also use expressions to perform comparisons between the
new data types. For more information, see Integration Services
Data Types and Cast (SSIS Expressions).
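For orientation, these SSIS types correspond to the date and time types added to the Database Engine in SQL Server 2008; the table below is purely illustrative and its names are hypothetical:

-- Hypothetical staging table whose columns map to the new SSIS data types.
CREATE TABLE dbo.EventStage
(
    EventTime        time(7),            -- read into SSIS as DT_DBTIME2
    EventTimestamp   datetime2(7),       -- read into SSIS as DT_DBTIMESTAMP2
    EventTimestampTz datetimeoffset(7)   -- read into SSIS as DT_DBTIMESTAMPOFFSET (includes a time zone offset)
);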
3) Enhanced SQL Statements
Integration
Services includes the following enhancements to Transact-SQL statements:
1. Perform multiple data manipulation language (DML) operations - Merge
Transact-SQL supports the use
of a MERGE operation in an SQL statement. The MERGE operation enables you to
express multiple INSERT, UPDATE, and DELETE operations in a single statement
against a specified target table. The operation that is applied to each row is determined by join conditions between the target table and a source table. For more information, see Inserting, Updating, and
Deleting Data by Using MERGE and Using MERGE in
Integration Services Packages.
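As a minimal sketch (the dimension and staging tables, dbo.DimCustomer and dbo.StagingCustomer, are hypothetical), a single MERGE statement can insert, update, and delete in one pass:

-- Upsert a hypothetical customer dimension from a staging table in one statement.
MERGE dbo.DimCustomer AS tgt
USING dbo.StagingCustomer AS src
    ON tgt.CustomerID = src.CustomerID
WHEN MATCHED THEN
    UPDATE SET tgt.CustomerName = src.CustomerName,
               tgt.City         = src.City
WHEN NOT MATCHED BY TARGET THEN
    INSERT (CustomerID, CustomerName, City)
    VALUES (src.CustomerID, src.CustomerName, src.City)
WHEN NOT MATCHED BY SOURCE THEN
    DELETE;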
2. Retrieve data about changes to a data source – Output and Insert
The INSERT operation supports
inserting rows into a target table that are returned by the OUTPUT clause of an
INSERT, UPDATE, DELETE, or MERGE operation. For more information, see INSERT (Transact-SQL).
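A brief sketch of this pattern, sometimes called INSERT over DML, is shown below; the product and audit tables are hypothetical:

-- Capture the rows changed by an UPDATE into a hypothetical audit table.
INSERT INTO dbo.PriceChangeAudit (ProductID, OldPrice, NewPrice)
SELECT ProductID, OldPrice, NewPrice
FROM
(
    UPDATE dbo.Product
    SET    ListPrice = ListPrice * 1.10
    OUTPUT inserted.ProductID,
           deleted.ListPrice  AS OldPrice,
           inserted.ListPrice AS NewPrice
) AS changes;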
3. Improve the performance of the bulk load operation when the data is sorted according to the clustered index on the table
The BULK option of the
OPENROWSET function supports the ORDER argument that specifies how the data in
the data file is already sorted. The ORDER argument does not cause a sort operation on the data; it is a hint to the SQL Server Database Engine that the data is already presorted in the file. If the data is not sorted, the Database Engine returns an error.
The OPENROWSET function
enables you to use OLE DB to access remote data. For more information, see OPENROWSET
(Transact-SQL).
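The following hedged sketch illustrates the ORDER hint; the file paths, format file, and target table are hypothetical, and the hint is only valid when the file really is sorted by the stated columns:

-- Bulk load a file that is already sorted by SaleID, the clustered index key of the
-- hypothetical target table, so the Database Engine can skip an extra sort.
INSERT INTO dbo.FactSales WITH (TABLOCK) (SaleID, SaleAmount)
SELECT SaleID, SaleAmount
FROM OPENROWSET(
         BULK N'D:\feeds\sales.dat',              -- hypothetical data file
         FORMATFILE = N'D:\feeds\sales.fmt',      -- hypothetical format file
         ORDER (SaleID ASC)                       -- hint: the file is presorted by SaleID
     ) AS src;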
IV. Performance and Troubleshooting Enhancements
1) Change Data Capture
Integration
Services can use a new Database Engine feature called change data capture.
This new feature captures insert, update, and delete activity that is applied
to SQL Server tables. Change data capture also makes the details of these
changes available in an easily consumed relational format.
For
information on using change data capture in an Integration
Services package to perform incremental loads, see Improving Incremental
Loads with Change Data Capture. For the latest samples of
how to use change data capture in Integration Services packages, see the SQL Server Samples Web
site.
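As a rough sketch of the Database Engine side of this feature (the dbo.Customer table and its dbo_Customer capture instance are hypothetical), enabling change data capture and reading a window of changes looks like the following:

-- Enable change data capture on the database and on a hypothetical source table.
EXEC sys.sp_cdc_enable_db;
EXEC sys.sp_cdc_enable_table
     @source_schema = N'dbo',
     @source_name   = N'Customer',
     @role_name     = NULL;            -- no gating role for this sketch

-- Later, an incremental load reads only the rows changed in an LSN window.
DECLARE @from_lsn binary(10) = sys.fn_cdc_get_min_lsn(N'dbo_Customer');
DECLARE @to_lsn   binary(10) = sys.fn_cdc_get_max_lsn();

SELECT *
FROM cdc.fn_cdc_get_all_changes_dbo_Customer(@from_lsn, @to_lsn, N'all');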
2) New Debug Dump Files
You
can create debug dump files (.mdmp and .tmp) that provide information about what happens when a
package runs. This information can help you in troubleshooting issues that
occur when you run the package.
To
create the debug dump files, you use certain command prompt options with the dtexec utility
and the dtutil command
prompt utility (dtutil.exe). For more information, see dtexec Utility and dtutil Utility.
For
more information about the type of information included in the debug dump file
(.tmp), see Working with Debug Dump
Files.
3. SSIS 2008R2
ADO
NET Destination Editor (Connection Manager Page) has a new option - Use bulk insert when available
This release includes a new option, Use
bulk insert when available, in the ADO NET Destination Editor. This option
enables you to use the System.Data.SqlClient.SqlBulkCopy class to improve the performance of bulk insert operations.
4. SSIS 2012 – Project Deployment Model, Parameters, and SSISDB Catalog
I. Development Enhancements
1) Project Connection Managers
This release allows you to
create connection managers at the project level that can be shared by multiple
packages in the project. The connection manager you create at the project level
is automatically visible in the Connection Managers tab of
the SSIS
Designer window
for all packages. For details, see Add, Delete, or Share a
Connection Manager in a Package.
3) Offline Connection Managers
Integration Services now
validates all connections before validating all of the data flow components
when a package is opened and sets any connections that are slow or unavailable
to work offline. This helps to reduce the delay in validating the package data
flow. After a package is
opened, you can also turn off a connection by right-clicking the
connection manager in the Connection Managers area
and then clicking Work Offline. This can speed up operations
in the SSIS Designer.
For more information, see Troubleshooting Tools
for Package Development.
4) Flat File Connection Manager Changes
The Flat File connection
manager now supports parsing files with embedded qualifiers. By default, the connection manager also checks for row delimiters to enable the correct parsing of files with rows that are missing column fields.
For more information, see Flat File Connection
Manager.
5) Parameters - to assign values to properties within packages at the time of package execution
Integration Services (SSIS)
parameters allow you to assign values to properties within packages at the time
of package execution. You can create project parameters at the project level
and package parameters at the package level. Project parameters are used to
supply any external input the project receives to one or more packages in the
project. Package parameters allow you to modify package execution without
having to edit and redeploy the package. For more information, see Integration Services
(SSIS) Parameters.
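Once a project is deployed, parameter values can also be set on the Integration Services server through the SSISDB catalog; the sketch below assumes a hypothetical ETL folder, LoadDW project, and SourceServer parameter:

-- Set a server-side default for a hypothetical project parameter.
EXEC SSISDB.catalog.set_object_parameter_value
     @object_type     = 20,                -- 20 = project parameter, 30 = package parameter
     @folder_name     = N'ETL',
     @project_name    = N'LoadDW',
     @parameter_name  = N'SourceServer',
     @parameter_value = N'PRODSQL01';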
6) Execute Package Task and Parameters – Now two ways to pass values to a child package: package configuration and parameters!
In this release, you can
configure the Execute Package task to make values available to the child
package by mapping variables or parameters to child package parameters. For
more information, see the Passing Values to Child Packages section
of Execute Package Task.
7) Comparing and Merging Packages - The structure of package .dtsx files has been modified to make it easier.
The structure of package .dtsx files has been modified to make it easier for you to
compare packages. You can also more reliably merge packages that don’t contain
conflicting changes or changes stored in binary format.
For more information, see SSIS Package Format.
8) Undo/Redo in SSIS Designer
You can undo and redo up to 20
actions in the SSIS Designer. For a package, undo/redo is available in the Control Flow, Data Flow, Event Handlers, and Parameters tabs,
and in the Variables window.
For a project, undo/redo is available in the Project Parameters window.
For details, see SSIS Designer.
9) Column Mapping- Columns are re-mapped based on their name and data type rather than by lineage ID
When an error occurs due to a
column mapping problem in the data flow, such as missing columns, the error
icon appears on the path or above the component if no path is connected. In addition,
you can still edit the component when there is a mapping problem.
The SSIS Designer is now
smarter about re-mapping columns when a new data source is connected. Columns
are re-mapped based on their name and data type rather than by lineage ID.
You can more easily resolve
issues with disconnected input paths and unmapped columns in an execution tree,
by using the Resolve Column Reference
Editor.
10) Script Task and Script Component – can target the .NET Framework 4.5
If you create a package by
using SQL Server Data Tools - Business Intelligence for Visual Studio 2012 that
includes a Script task or Script component, you can configure the script
project to target the .NET Framework 4.5 class library. You set the Target framework property
in Microsoft Visual Studio Tools for Applications (VSTA).
You can install SQL Server Data Tools - Business
Intelligence for Visual Studio 2012 side by side with SQL Server Data Tools (SSDT)
that is included in SQL Server 2012.
Note: Script tasks and Script components that target the .NET Framework 4.5 class library cannot be edited in the SQL Server Data Tools (SSDT) that is included in SQL Server 2012. The workaround is to load the package in SQL Server Data Tools - Business Intelligence for Visual Studio 2012 and change the script project to target the .NET Framework 4.0 class library.
If you plan to edit the package in both SQL Server Data Tools - Business Intelligence for Visual Studio 2012 and the SQL Server Data Tools (SSDT) that is included in SQL Server 2012, it is recommended that you configure the script project to target the .NET Framework 4.0 class library.
II. Data Quality
1) DQS Cleansing Transformation – new DQS Cleansing transformation
Integration Services now
includes the DQS Cleansing transformation that enables you to more easily and
accurately improve the quality of data.
For more information about
improving data quality using these components, see DQS Cleansing
Transformation.
III. Management and Troubleshooting
1) Server Environments
Server environments are a new
concept in this release of Integration Services. You use an environment to
specify runtime values for packages contained in a project you’ve deployed to
the Integration Services server. An environment contains variables, and the variable values are mapped to the project parameters.
For more information, see Create and Map a Server
Environment and Deployment of Projects
and Packages.
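A minimal sketch of creating an environment through the SSISDB catalog stored procedures follows; the folder, environment, and variable names are hypothetical:

-- Create a hypothetical environment and one variable that a project parameter can reference.
EXEC SSISDB.catalog.create_environment
     @folder_name      = N'ETL',
     @environment_name = N'Production';

EXEC SSISDB.catalog.create_environment_variable
     @folder_name      = N'ETL',
     @environment_name = N'Production',
     @variable_name    = N'SourceServer',
     @data_type        = N'String',
     @sensitive        = 0,
     @value            = N'PRODSQL01',
     @description      = N'Name of the source server';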
2) SSISDB Catalog
The SSISDB catalog
is the central point for working with Integration Services (SSIS) projects that
you’ve deployed to the Integration Services server. For example, you set
project and package parameters, configure environments to specify runtime
values for packages, execute and troubleshoot packages, and manage a variety of
Integration Services server operations.
For more information, see the
following topics.
·
Backup, Restore, and
Move the SSIS Catalog
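For example, a deployed package can be run entirely from Transact-SQL with the catalog stored procedures; the folder, project, and package names below are hypothetical:

-- Create, configure, and start an execution of a deployed package.
DECLARE @execution_id bigint;

EXEC SSISDB.catalog.create_execution
     @folder_name  = N'ETL',
     @project_name = N'LoadDW',
     @package_name = N'LoadFact.dtsx',
     @execution_id = @execution_id OUTPUT;

-- Optional: set the built-in LOGGING_LEVEL system parameter for this run (1 = Basic).
DECLARE @logging_level smallint = 1;
EXEC SSISDB.catalog.set_execution_parameter_value
     @execution_id    = @execution_id,
     @object_type     = 50,               -- 50 = system parameter
     @parameter_name  = N'LOGGING_LEVEL',
     @parameter_value = @logging_level;

EXEC SSISDB.catalog.start_execution @execution_id = @execution_id;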
3) Troubleshooting Performance and Data Issues
This release provides
additional views, stored procedures, and stored functions to help you
troubleshoot performance and data issues.
The following list pairs each troubleshooting capability with the related SSISDB views, stored procedures, and functions:
· Get performance statistics and other information for an execution: catalog.executions (SSISDB Database), catalog.execution_component_phases.
· Add, remove, and query data taps in a package data flow: catalog.add_data_tap, catalog.remove_data_tap, catalog.execution_data_taps.
· Create a dump for a running package: catalog.create_execution_dump.
· Set a parameter value in an instance of an execution, and view parameter values that are used by a package during an instance of execution: catalog.set_execution_parameter_value, catalog.execution_parameter_values.
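As a quick illustration, the execution history can be queried directly from the catalog views listed above; this query is only a minimal sketch:

-- Most recent executions on the server, with status and timing, from the SSISDB catalog views.
SELECT TOP (10)
       e.execution_id,
       e.folder_name,
       e.project_name,
       e.package_name,
       e.status,          -- 1 = created, 2 = running, 4 = failed, 7 = succeeded (among other values)
       e.start_time,
       e.end_time
FROM SSISDB.catalog.executions AS e
ORDER BY e.execution_id DESC;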
4) Reports for Troubleshooting Package Operations
In this release, standard
reports are available in SQL Server Management Studio to help you troubleshoot
Integration Services packages that have been deployed to the SSISDB catalog.
These operations reports help you to view package execution status and identify
the cause of execution failures.
For more information, see Reports for the
Integration Services Server.
IV. Performance
1) Reduced Memory Usage by the Merge and Merge Join Transformations
Microsoft has made the
Integration Services Merge and Merge Join transformations more robust and
reliable. This is achieved by reducing the risk that these components will
consume excessive memory when the multiple inputs produce data at uneven rates.
This improvement helps packages that use the Merge or Merge Join
transformations to use memory more efficiently.
Microsoft has also provided
new properties and methods for developers of custom data flow components to
implement a similar solution in their own components. This improvement makes it
more feasible to develop a robust custom data flow component that supports
multiple inputs. For more information, see Developing Data Flow
Components with Multiple Inputs.
V. Deployment- deploy your projects to the Integration Services server
In the
current release of Integration Services, you can deploy your projects to the
Integration Services server. This is the new project deployment model. The
Integration Services server enables you to manage packages, run packages, and
configure runtime values for packages by using environments.
The
following topics provide information about converting projects to the project deployment model, deploying projects, and the Integration Services server.
·
Deployment of Projects
and Packages
·
Deploy Projects to
Integration Services Server
·
Integration Services
(SSIS) Server
VI. Access to Samples and Tutorials
1) Getting Started Window
The Getting Started window
in the SSIS Designer provides links to samples, tutorials and videos. You can
customize the window by adding links to additional content. For more
information, see Integration Services
User Interface.
2) SSIS Toolbox
The SSIS Toolbox in SQL
Server Data Tools (SSDT) provides links to samples and Help content for Control
Flow and Data Flow items. You can add links for custom components. For more
information, see Developing Custom
Objects for Integration Services.
5. SSIS 2014 – No Change
6. SSIS 2016
I. AlwaysOn support - Add the SSISDB database (just the same as any other user database) to an AlwaysOn Availability Group
The AlwaysOn Availability Groups feature is a high-availability
and disaster-recovery solution that provides an enterprise-level alternative to
database mirroring. An availability group supports a failover environment
for a discrete set of user databases, known as availability databases, that fail over together. For more information, see AlwaysOn Availability Groups.
In SQL
Server 2016, SQL Server Integration Services (SSIS) introduces new capabilities
that allow you to easily deploy to a centralized SSIS Catalog (that is, the SSISDB user database). To provide high availability for the SSISDB database and its contents (projects, packages, execution logs, and so on), you can add the SSISDB database
(just the same as any other user database) to an AlwaysOn
Availability Group. When a failover occurs, one of the secondary nodes
automatically becomes the new primary node.
Configure SSIS support for AlwaysOn
1) Step 1: Create Integration Services Catalog
o
Launch SQL
Server Management Studio and
connect to a SQL Server instance in the cluster that you want to set as the primary node of the AlwaysOn availability group for SSISDB.
o
In Object Explorer, expand the server node, right-click the Integration Services
Catalogs node,
and then click Create
Catalog.
o
Click Enable
CLR Integration. The catalog uses CLR stored procedures.
o
Click Enable
automatic execution of Integration Services stored procedure at SQL Server
startup to
enable the catalog.startup stored procedure to run each time the
SSIS server instance is restarted. The stored procedure performs maintenance of
the state of operations for the SSISDB catalog. It fixes the status of any
packages that were running if and when the SSIS server instance goes down.
o
Enter a password, and
then click Ok. The
password protects the database master key that is used for encrypting the
catalog data. Save the password in a secure location. It is recommended that
you also back up the database master key.
2) Step 2: Add SSISDB to an AlwaysOn
Availability Group
o
Adding the SSISDB database to an AlwaysOn Availability Group is almost the same as adding any other user database to an availability group; a brief T-SQL sketch of this step follows the list below. That is:
§ In
Object Explorer, connect to the server instance that hosts the primary replica.
§ Expand
the AlwaysOn High Availability node and the Availability Groups node.
§ To
launch the New Availability Group Wizard, select the New Availability Group Wizard command.
§ The
first time you run this wizard, an Introduction page appears. To bypass this page in the
future, you can click Do not show this page
again. After reading this page, click Next.
§ On the Specify Availability Group Name page, enter the name of the new
availability group in the Availability group name field. This name must be a valid SQL
Server identifier that is unique on the WSFC failover cluster and in your
domain as a whole. The maximum length for an availability group name is 128
characters.
§ On the Select Databases page, the grid lists user databases on the
connected server instance that are eligible to become the availability databases.
Select one or more of the listed databases to participate in the new availability group. These databases will initially be the primary databases.
§ On the Specify Replicas page, specify and configure one or more
replicas for the new availability group.
§ …..
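As a minimal Transact-SQL sketch of the key step that the wizard performs (the availability group name SSIS_AG is hypothetical, and SSISDB must already use the full recovery model and have a current full backup):

-- Run on the primary replica: add SSISDB to a hypothetical existing availability group.
ALTER AVAILABILITY GROUP [SSIS_AG]
ADD DATABASE [SSISDB];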
3) Step 3: Enable SSIS support for AlwaysOn
o
After you create the Integration Services catalog, right-click the Integration Services
Catalogs node, and then click Enable AlwaysOn Support…. You should see an Enable
Support for AlwaysOn dialog box. If this
menu item is disabled, confirm that you have all the prerequisites installed
and click Refresh.
For
detailed overview and step-by-step instructions for enabling AlwaysOn for SSISDB, see AlwaysOn for SSIS Catalog (SSISDB).
II. Incremental package deployment
The
Incremental Package Deployment feature allows you to deploy one or more
packages to an existing or new project without deploying the whole project. At this time, you can incrementally deploy packages by using the Deployment Wizard, SQL Server Management Studio (which uses the Deployment Wizard), stored procedures, or the Management Object Model (MOM) API. See Deploy Packages to
Integration Services Server for
details.
III. Project Upgrade
When you upgrade your SSIS projects from a previous version to the current version, the
project-level connection managers will continue to work as usual and the
package layout/annotations are retained.