Commit 985e2e8

Merge pull request #17778 from MicrosoftDocs/master
11/03 Publishing
2 parents f888ac9 + 59f4988 commit 985e2e8

15 files changed: +164 −89 lines

docs/azure-data-studio/preview-features.md

Lines changed: 1 addition & 1 deletion

@@ -72,7 +72,7 @@ You can enable or disable preview features anytime in your Settings.
 ### First-party extensions in preview
 
 * Admin Pack for SQL Server
-* Azure SQL Data Warehouse Insights
+* Azure Synapse Analytics Insights
 * Central Management Servers
 * Database Administration Tool Extensions for Windows
 * Kusto

docs/includes/applies-to-version/sql-asdb-asdbmi-asa-sql-poolonly-pdw.md

Lines changed: 1 addition & 1 deletion

@@ -6,4 +6,4 @@ ms.date: 06/11/2020
 ms.author: mikeray
 ---
 
-[!INCLUDE [Applies to](../../includes/applies-md.md)] [!INCLUDE [SQL Server](_ssnoversion.md)] [!INCLUDE [Azure SQL Database](../../includes/applies-to-version/_asdb.md)] [!INCLUDE [SQL Managed Instance](../../includes/applies-to-version/_asdbmi.md)] [!INCLUDE [Azure Synapse Analytics (SQL pool only)](../../includes/applies-to-version/_asa-sqlpool-only.md)] [!INCLUDE [Parallel Data Warehouse](../../includes/applies-to-version/_pdw.md)]
+[!INCLUDE [Applies to](../../includes/applies-md.md)] [!INCLUDE [SQL Server](_ssnoversion.md)] [!INCLUDE [Azure SQL Database](../../includes/applies-to-version/_asdb.md)] [!INCLUDE [SQL Managed Instance](../../includes/applies-to-version/_asdbmi.md)] [!INCLUDE [Azure Synapse Analytics (dedicated SQL pool only)](../../includes/applies-to-version/_asa-sqlpool-only.md)] [!INCLUDE [Parallel Data Warehouse](../../includes/applies-to-version/_pdw.md)]

docs/includes/ssazuresynapse_sqlpool_only.md

Lines changed: 1 addition & 1 deletion

@@ -6,4 +6,4 @@ ms.date: 06/11/2020
 ms.author: mikeray
 ---
 
-Azure Synapse Analytics (SQL pool only)
+Azure Synapse Analytics (dedicated SQL pool only)

docs/includes/sssodfull-md.md

Lines changed: 1 addition & 1 deletion

@@ -1 +1 @@
-SQL on-demand (preview) in Azure Synapse Analytics
+serverless SQL pool (preview) in Azure Synapse Analytics

Lines changed: 1 addition & 1 deletion

@@ -1,2 +1,2 @@
 > [!NOTE]
-> This syntax is not supported by SQL on-demand (preview) in Azure Synapse Analytics.
+> This syntax is not supported by serverless SQL pool (preview) in Azure Synapse Analytics.

docs/integration-services/load-data-to-sql-data-warehouse.md

Lines changed: 17 additions & 19 deletions

@@ -1,6 +1,6 @@
 ---
-title: Load data into Azure Synapse Analytics with SQL Server Integration Services (SSIS) | Microsoft Docs
-description: Shows you how to create a SQL Server Integration Services (SSIS) package to move data from a wide variety of data sources to Azure Synapse Analytics.
+title: Load data into Azure Synapse Analytics with SQL Server Integration Services (SSIS)
+description: Shows you how to create a SQL Server Integration Services (SSIS) package to move data from a wide variety of data sources into a dedicated SQL pool in Azure Synapse Analytics.
 documentationcenter: NA
 ms.prod: sql
 ms.prod_service: "integration-services"
@@ -11,13 +11,11 @@ ms.date: 08/09/2018
 ms.author: chugu
 author: chugugrace
 ---
-# Load data into Azure Synapse Analytics with SQL Server Integration Services (SSIS)
+# Load data into a dedicated SQL pool in Azure Synapse Analytics with SQL Server Integration Services (SSIS)
 
-[!INCLUDE[sqlserver-ssis](../includes/applies-to-version/sqlserver-ssis.md)]
+[!INCLUDE[asa](../includes/applies-to-version/asa.md)]
 
-
-
-Create a SQL Server Integration Services (SSIS) package to load data into [Azure Synapse Analytics](/azure/sql-data-warehouse/index). You can optionally restructure, transform, and cleanse the data as it passes through the SSIS data flow.
+Create a SQL Server Integration Services (SSIS) package to load data into a [dedicated SQL pool in Azure Synapse Analytics](/azure/sql-data-warehouse/index). You can optionally restructure, transform, and cleanse the data as it passes through the SSIS data flow.
 
 This article shows you how to do the following things:
 
@@ -35,8 +33,8 @@ A detailed introduction to SSIS is beyond the scope of this article. To learn mo
 
 - [How to Create an ETL Package](ssis-how-to-create-an-etl-package.md)
 
-## Options for loading data into SQL Data Warehouse with SSIS
-SQL Server Integration Services (SSIS) is a flexible set of tools that provides a variety of options for connecting to, and loading data into, SQL Data Warehouse.
+## Options for loading data into Azure Synapse Analytics with SSIS
+SQL Server Integration Services (SSIS) is a flexible set of tools that provides a variety of options for connecting to, and loading data into, Azure Synapse Analytics.
 
 1. The preferred method, which provides the best performance, is to create a package that uses the [Azure SQL DW Upload Task](control-flow/azure-sql-dw-upload-task.md) to load the data. This task encapsulates both source and destination information. It assumes that your source data is stored locally in delimited text files.
 
@@ -48,7 +46,7 @@ To step through this tutorial, you need the following things:
 1. **SQL Server Integration Services (SSIS)**. SSIS is a component of SQL Server and requires a licensed version, or the developer or evaluation version, of SQL Server. To get an evaluation version of SQL Server, see [Evaluate SQL Server](https://www.microsoft.com/evalcenter/evaluate-sql-server-2017-rtm).
 2. **Visual Studio** (optional). To get the free Visual Studio Community Edition, see [Visual Studio Community][Visual Studio Community]. If you don't want to install Visual Studio, you can install SQL Server Data Tools (SSDT) only. SSDT installs a version of Visual Studio with limited functionality.
 3. **SQL Server Data Tools for Visual Studio (SSDT)**. To get SQL Server Data Tools for Visual Studio, see [Download SQL Server Data Tools (SSDT)][Download SQL Server Data Tools (SSDT)].
-4. **An Azure Synapse Analytics database and permissions**. This tutorial connects to a SQL Data Warehouse instance and loads data into it. You have to have permission to connect, to create a table, and to load data.
+4. **An Azure Synapse Analytics database and permissions**. This tutorial connects to a dedicated SQL pool in Azure Synapse Analytics and loads data into it. You have to have permission to connect, to create a table, and to load data.
 
 ## Create a new Integration Services project
 1. Launch Visual Studio.
@@ -74,7 +72,7 @@ To continue the tutorial with this option, you need the following things:
 
 - The [Microsoft SQL Server Integration Services Feature Pack for Azure][Microsoft SQL Server 2017 Integration Services Feature Pack for Azure]. The SQL DW Upload task is a component of the Feature Pack.
 
-- An [Azure Blob Storage](/azure/storage/) account. The SQL DW Upload task loads data from Azure Blob Storage into Azure Synapse Analytics. You can load files that are already in Blob Storage, or you can load files from your computer. If you select files on your computer, the SQL DW Upload task uploads them to Blob Storage first for staging, and then loads them into SQL Data Warehouse.
+- An [Azure Blob Storage](https://docs.microsoft.com/azure/storage/) account. The SQL DW Upload task loads data from Azure Blob Storage into Azure Synapse Analytics. You can load files that are already in Blob Storage, or you can load files from your computer. If you select files on your computer, the SQL DW Upload task uploads them to Blob Storage first for staging, and then loads them into your dedicated SQL pool.
 
 ### Add and configure the SQL DW Upload Task
 
@@ -92,25 +90,25 @@ For more control, you can manually create a package that emulates the work done
 
 1. Use the Azure Blob Upload Task to stage the data in Azure Blob Storage. To get the Azure Blob Upload task, download the [Microsoft SQL Server Integration Services Feature Pack for Azure][Microsoft SQL Server 2017 Integration Services Feature Pack for Azure].
 
-2. Then use the SSIS Execute SQL task to launch a PolyBase script that loads the data into SQL Data Warehouse. For an example that loads data from Azure Blob Storage into SQL Data Warehouse (but not with SSIS), see [Tutorial: Load data to Azure Synapse Analytics](/azure/sql-data-warehouse/load-data-wideworldimportersdw).
+2. Then use the SSIS Execute SQL task to launch a PolyBase script that loads the data into your dedicated SQL pool. For an example that loads data from Azure Blob Storage into a dedicated SQL pool (but not with SSIS), see [Tutorial: Load data to Azure Synapse Analytics](/azure/sql-data-warehouse/load-data-wideworldimportersdw).
 
 ## Option 2 - Use a source and destination
 
 The second approach is a typical package which uses a Data Flow task that contains a source and a destination. This approach supports a wide range of data sources, including SQL Server and Azure Synapse Analytics.
 
 This tutorial uses SQL Server as the data source. SQL Server runs on premises or on an Azure virtual machine.
 
-To connect to SQL Server and to SQL Data Warehouse, you can use an ADO.NET connection manager and source and destination, or an OLE DB connection manager and source and destination. This tutorial uses ADO.NET because it has the fewest configuration options. OLE DB may provide slightly better performance than ADO.NET.
+To connect to SQL Server and to a dedicated SQL pool, you can use an ADO.NET connection manager and source and destination, or an OLE DB connection manager and source and destination. This tutorial uses ADO.NET because it has the fewest configuration options. OLE DB may provide slightly better performance than ADO.NET.
 
 As a shortcut, you can use the SQL Server Import and Export Wizard to create the basic package. Then, save the package, and open it in Visual Studio or SSDT to view and customize it. For more info, see [Import and Export Data with the SQL Server Import and Export Wizard](import-export-data/import-and-export-data-with-the-sql-server-import-and-export-wizard.md).
 
 ### Prerequisites for Option 2
 
 To continue the tutorial with this option, you need the following things:
 
-1. **Sample data**. This tutorial uses sample data stored in SQL Server in the AdventureWorks sample database as the source data to be loaded into SQL Data Warehouse. To get the AdventureWorks sample database, see [AdventureWorks Sample Databases][AdventureWorks 2014 Sample Databases].
+1. **Sample data**. This tutorial uses sample data stored in SQL Server in the AdventureWorks sample database as the source data to be loaded into a dedicated SQL pool. To get the AdventureWorks sample database, see [AdventureWorks Sample Databases][AdventureWorks 2014 Sample Databases].
 
-2. **A firewall rule**. You have to create a firewall rule on SQL Data Warehouse with the IP address of your local computer before you can upload data to the SQL Data Warehouse.
+2. **A firewall rule**. You have to create a firewall rule on your dedicated SQL pool with the IP address of your local computer before you can upload data to the dedicated SQL pool.
 
 ### Create the basic data flow
 1. Drag a Data Flow Task from the Toolbox to the center of the design surface (on the **Control Flow** tab).
@@ -169,9 +167,9 @@ To continue the tutorial with this option, you need the following things:
 3. In the **Configure ADO.NET Connection Manager** dialog box, click the **New** button to open the **Connection Manager** dialog box and create a new data connection.
 4. In the **Connection Manager** dialog box, do the following things.
 1. For **Provider**, select the SqlClient Data Provider.
-2. For **Server name**, enter the SQL Data Warehouse name.
+2. For **Server name**, enter the dedicated SQL pool name.
 3. In the **Log on to the server** section, select **Use SQL Server authentication** and enter authentication information.
-4. In the **Connect to a database** section, select an existing SQL Data Warehouse database.
+4. In the **Connect to a database** section, select an existing dedicated SQL pool database.
 5. Click **Test Connection**.
 6. In the dialog box that reports the results of the connection test, click **OK** to return to the **Connection Manager** dialog box.
 7. In the **Connection Manager** dialog box, click **OK** to return to the **Configure ADO.NET Connection Manager** dialog box.
@@ -182,8 +180,8 @@ To continue the tutorial with this option, you need the following things:
 7. In the **Create Table** dialog box, do the following things.
 
 1. Change the name of the destination table to **SalesOrderDetail**.
-2. Remove the **rowguid** column. The **uniqueidentifier** data type is not supported in SQL Data Warehouse.
-3. Change the data type of the **LineTotal** column to **money**. The **decimal** data type is not supported in SQL Data Warehouse. For info about supported data types, see [CREATE TABLE (Azure Synapse Analytics, Parallel Data Warehouse)][CREATE TABLE (Azure Synapse Analytics, Parallel Data Warehouse)].
+2. Remove the **rowguid** column. The **uniqueidentifier** data type is not supported in dedicated SQL pool.
+3. Change the data type of the **LineTotal** column to **money**. The **decimal** data type is not supported in dedicated SQL pool. For info about supported data types, see [CREATE TABLE (Azure Synapse Analytics, Parallel Data Warehouse)][CREATE TABLE (Azure Synapse Analytics, Parallel Data Warehouse)].
 
 ![Screenshot of the Create Table dialog box, with code to create a table named SalesOrderDetail with LineTotal as a money column and no rowguid column.][12b]
 4. Click **OK** to create the table and return to the **ADO.NET Destination Editor**.
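
The PolyBase approach in Option 1 above (stage files in Blob Storage, then run a load script from an Execute SQL task) typically boils down to a script like the following. This is a minimal sketch, not the tutorial's actual script; every object name, the storage account, and the container are hypothetical placeholders:

```sql
-- Minimal PolyBase load sketch for a dedicated SQL pool.
-- All names (StagingBlob, PipeDelimited, ext.SalesOrderDetail, the storage
-- account and container) are hypothetical placeholders.

-- Required once per database before creating a scoped credential.
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong-password>';

CREATE DATABASE SCOPED CREDENTIAL BlobStorageCredential
WITH IDENTITY = 'user', SECRET = '<storage-account-access-key>';

CREATE EXTERNAL DATA SOURCE StagingBlob
WITH ( TYPE = HADOOP,
       LOCATION = 'wasbs://staging@<account>.blob.core.windows.net',
       CREDENTIAL = BlobStorageCredential );

CREATE EXTERNAL FILE FORMAT PipeDelimited
WITH ( FORMAT_TYPE = DELIMITEDTEXT,
       FORMAT_OPTIONS ( FIELD_TERMINATOR = '|' ) );

-- External table over the staged files; column list trimmed for illustration.
CREATE EXTERNAL TABLE ext.SalesOrderDetail
(   SalesOrderID  int       NOT NULL,
    OrderQty      smallint  NOT NULL,
    LineTotal     money     NOT NULL )
WITH ( LOCATION = '/salesorderdetail/',
       DATA_SOURCE = StagingBlob,
       FILE_FORMAT = PipeDelimited );

-- CTAS performs the parallel load into the dedicated SQL pool.
CREATE TABLE dbo.SalesOrderDetail
WITH ( DISTRIBUTION = HASH(SalesOrderID) )
AS SELECT * FROM ext.SalesOrderDetail;
```

Packaged in an Execute SQL task, this reproduces what the Azure SQL DW Upload task automates.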

docs/integration-services/what-s-new-in-integration-services-in-sql-server-2016.md

Lines changed: 1 addition & 1 deletion

@@ -305,7 +305,7 @@ The latest version of the Azure Feature Pack includes a connection manager, sour
 
 #### <a name="sqldwupload"></a> Support for Azure Synapse Analytics released
 
-The latest version of the Azure Feature Pack includes the Azure SQL DW Upload task for populating SQL Data Warehouse with data. For more info, see [Azure Feature Pack for Integration Services &#40;SSIS&#41;](../integration-services/azure-feature-pack-for-integration-services-ssis.md)
+The latest version of the Azure Feature Pack includes the Azure SQL DW Upload task for populating Azure Synapse Analytics with data. For more info, see [Azure Feature Pack for Integration Services &#40;SSIS&#41;](../integration-services/azure-feature-pack-for-integration-services-ssis.md)
 
 ## Usability and productivity
docs/relational-databases/system-dynamic-management-views/sys-dm-pdw-exec-requests-transact-sql.md

Lines changed: 5 additions & 5 deletions

@@ -35,11 +35,11 @@ monikerRange: ">= aps-pdw-2016 || = azure-sqldw-latest || = sqlallproducts-allve
 |database_id|**int**|Identifier of database used by explicit context (for example, USE DB_X).|See ID in [sys.databases &#40;Transact-SQL&#41;](../../relational-databases/system-catalog-views/sys-databases-transact-sql.md).|
 |command|**nvarchar(4000)**|Holds the full text of the request as submitted by the user.|Any valid query or request text. Queries that are longer than 4000 bytes are truncated.|
 |resource_class|**nvarchar(20)**|The workload group used for this request. |Static Resource Classes</br>staticrc10</br>staticrc20</br>staticrc30</br>staticrc40</br>staticrc50</br>staticrc60</br>staticrc70</br>staticrc80</br> </br>Dynamic Resource Classes</br>SmallRC</br>MediumRC</br>LargeRC</br>XLargeRC|
-|importance|**nvarchar(128)**|The Importance setting the request executed at. This is the relative importance of a request in this workload group and across workload groups for shared resources. Importance specified in the classifier overrides the workload group importance setting.</br>Applies to: Azure SQL Data Warehouse|NULL</br>low</br>below_normal</br>normal (default)</br>above_normal</br>high|
-|group_name|**sysname** |For requests utilizing resources, group_name is the name of the workload group the request is running under. If the request does not utilize resources, group_name is null.</br>Applies to: Azure SQL Data Warehouse|
+|importance|**nvarchar(128)**|The Importance setting the request executed at. This is the relative importance of a request in this workload group and across workload groups for shared resources. Importance specified in the classifier overrides the workload group importance setting.</br>Applies to: Azure Synapse Analytics|NULL</br>low</br>below_normal</br>normal (default)</br>above_normal</br>high|
+|group_name|**sysname** |For requests utilizing resources, group_name is the name of the workload group the request is running under. If the request does not utilize resources, group_name is null.</br>Applies to: Azure Synapse Analytics|
 |classifier_name|**sysname**|For requests utilizing resources, The name of the classifier used for assigning resources and importance.||
-|resource_allocation_percentage|**decimal(5,2)**|The percentage amount of resources allocated to the request.</br>Applies to: Azure SQL Data Warehouse|
-|result_cache_hit|**int**|Details whether a completed query used result set cache. </br>Applies to: Azure SQL Data Warehouse| 1 = Result set cache hit </br> 0 = Result set cache miss </br> Negative integer values = Reasons why result set caching was not used. See remarks section for details.|
+|resource_allocation_percentage|**decimal(5,2)**|The percentage amount of resources allocated to the request.</br>Applies to: Azure Synapse Analytics|
+|result_cache_hit|**int**|Details whether a completed query used result set cache. </br>Applies to: Azure Synapse Analytics| 1 = Result set cache hit </br> 0 = Result set cache miss </br> Negative integer values = Reasons why result set caching was not used. See remarks section for details.|
 |client_correlation_id|**nvarchar(255)**|Optional user-defined name for a client session. To set for a session, call sp_set_session_context 'client_correlation_id', '<CorrelationIDName>'. Run `SELECT SESSION_CONTEXT(N'client_correlation_id')` to retrieve its value.|
 ||||
 
@@ -75,4 +75,4 @@ The negative integer value in the result_cache_hit column is a bitmap value of a
 
 ## See Also
 
-[SQL Data Warehouse and Parallel Data Warehouse Dynamic Management Views &#40;Transact-SQL&#41;](../../relational-databases/system-dynamic-management-views/sql-and-parallel-data-warehouse-dynamic-management-views.md)
+[Azure Synapse Analytics and Parallel Data Warehouse Dynamic Management Views &#40;Transact-SQL&#41;](../../relational-databases/system-dynamic-management-views/sql-and-parallel-data-warehouse-dynamic-management-views.md)
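
As a usage sketch for the columns renamed above (assuming a connection to a dedicated SQL pool; the session tag value is hypothetical):

```sql
-- Recent requests with their workload classification,
-- using the columns documented in the table above.
SELECT TOP 10
       request_id,
       status,
       resource_class,
       importance,
       group_name,
       result_cache_hit,
       command
FROM sys.dm_pdw_exec_requests
ORDER BY submit_time DESC;

-- Tag the current session, then read the tag back, as described
-- for the client_correlation_id column.
EXEC sp_set_session_context 'client_correlation_id', 'nightly-etl-run';
SELECT SESSION_CONTEXT(N'client_correlation_id');
```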

docs/relational-databases/system-tables/backupset-transact-sql.md

Lines changed: 1 addition & 1 deletion

@@ -50,7 +50,7 @@ monikerRange: ">=aps-pdw-2016||>=sql-server-2016||=sqlallproducts-allversions||>
 |**software_major_version**|**tinyint**|[!INCLUDE[msCoName](../../includes/msconame-md.md)] [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)] major version number. Can be NULL.|
 |**software_minor_version**|**tinyint**|[!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)] minor version number. Can be NULL.|
 |**software_build_version**|**smallint**|[!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)] build number. Can be NULL.|
-|**time_zone**|**smallint**|Difference between local time (where the backup operation is taking place) and Coordinated Universal Time (UTC) in 15-minute intervals at the time the backup operation started. Values can be -48 through +48, inclusive. A value of 127 indicates unknown. For example, -20 is Eastern Standard Time (EST) or five hours after UTC. Can be NULL.|
+|**time_zone**|**smallint**|Difference between local time (where the backup operation is taking place) and Coordinated Universal Time (UTC) in 15-minute intervals using the time zone information at the time the backup operation started. Values can be -48 through +48, inclusive. A value of 127 indicates unknown. For example, -20 is Eastern Standard Time (EST) or five hours after UTC. Can be NULL.|
 |**mtf_minor_version**|**tinyint**|[!INCLUDE[msCoName](../../includes/msconame-md.md)] Tape Format minor version number. Can be NULL.|
 |**first_lsn**|**numeric(25,0)**|Log sequence number of the first or oldest log record in the backup set. Can be NULL.|
 |**last_lsn**|**numeric(25,0)**|Log sequence number of the next log record after the backup set. Can be NULL.|
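
Since **time_zone** is stored in 15-minute units, the offset in minutes is simply `time_zone * 15`. A small illustrative query against msdb (which owns this table):

```sql
-- time_zone counts 15-minute intervals relative to UTC:
-- e.g. -20 * 15 = -300 minutes, i.e. UTC-5 (Eastern Standard Time).
SELECT backup_start_date,
       time_zone,
       time_zone * 15 AS utc_offset_minutes
FROM msdb.dbo.backupset
WHERE time_zone <> 127;   -- 127 means the offset is unknown
```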
