[mgmt] datafactory release #28461

Merged 2 commits on Feb 23, 2024
46 changes: 38 additions & 8 deletions sdk/datafactory/arm-datafactory/CHANGELOG.md
@@ -1,15 +1,45 @@
# Release History

+## 14.0.0 (2024-02-04)

+**Features**

-## 13.0.1 (Unreleased)

-### Features Added

-### Breaking Changes

-### Bugs Fixed
+- Added Interface SnowflakeV2Dataset
+- Added Interface SnowflakeV2LinkedService
+- Added Interface SnowflakeV2Sink
+- Added Interface SnowflakeV2Source
+- Added Interface WarehouseLinkedService
+- Added Interface WarehouseSink
+- Added Interface WarehouseSource
+- Added Interface WarehouseTableDataset
+- Added Type Alias SnowflakeAuthenticationType
+- Interface SalesforceServiceCloudV2LinkedService has a new optional parameter authenticationType
+- Interface SalesforceServiceCloudV2Source has a new optional parameter includeDeletedObjects
+- Interface SalesforceV2LinkedService has a new optional parameter authenticationType
+- Interface SalesforceV2Source has a new optional parameter includeDeletedObjects
+- Type of parameter type of interface CopySink has two new values "WarehouseSink" | "SnowflakeV2Sink"
+- Type of parameter type of interface CopySource has two new values "WarehouseSource" | "SnowflakeV2Source"
+- Type of parameter type of interface Dataset has two new values "SnowflakeV2Table" | "WarehouseTable"
+- Type of parameter type of interface LinkedService has two new values "SnowflakeV2" | "Warehouse"
+- Type of parameter type of interface TabularSource has a new value "WarehouseSource"
+- Added Enum KnownSnowflakeAuthenticationType

-### Other Changes
+**Breaking Changes**

+- Interface SalesforceServiceCloudV2Source no longer has parameter readBehavior
+- Interface SalesforceV2Source no longer has parameter readBehavior
+- Type of parameter headers of interface AzureFunctionActivity is changed from any to { [propertyName: string]: string; }
+- Type of parameter headers of interface WebActivity is changed from any to { [propertyName: string]: string; }
+- Type of parameter headers of interface WebHookActivity is changed from any to { [propertyName: string]: string; }
+- Removed Enum KnownSalesforceV2SourceReadBehavior


## 13.0.0 (2023-12-28)

**Features**
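
Of the 14.0.0 changes above, the re-typing of headers on AzureFunctionActivity, WebActivity, and WebHookActivity (from any to a string-to-string map) is the change most likely to require edits when upgrading from 13.x. The sketch below shows one way calling code might adapt; it is not taken from this PR, and the activity name, URL, and header values are hypothetical.

```typescript
// Minimal sketch of adapting to the 14.0.0 headers typing, where headers is
// { [propertyName: string]: string } rather than any. All literal values here
// are made up for illustration.
import { WebActivity } from "@azure/arm-datafactory";

const correlationId = 12345; // a numeric value that the old `any` typing tolerated

const activity: WebActivity = {
  name: "CallStatusEndpoint",
  type: "WebActivity",
  method: "POST",
  url: "https://contoso.example.com/api/status",
  headers: {
    "Content-Type": "application/json",
    // Non-string values must now be converted to strings explicitly.
    "x-correlation-id": String(correlationId),
  },
  body: { status: "ok" },
};

console.log(activity.headers);
```

The same conversion applies to AzureFunctionActivity and WebHookActivity definitions that previously passed numbers or objects as header values.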
2 changes: 1 addition & 1 deletion sdk/datafactory/arm-datafactory/LICENSE
@@ -1,6 +1,6 @@
The MIT License (MIT)

-Copyright (c) 2023 Microsoft
+Copyright (c) 2024 Microsoft

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
6 changes: 3 additions & 3 deletions sdk/datafactory/arm-datafactory/_meta.json
@@ -1,8 +1,8 @@
{
-"commit": "4792bce7667477529991457890b4a6b670e70508",
+"commit": "45f5b5a166c75a878d0f5404e74bd1855ff48894",
"readme": "specification/datafactory/resource-manager/readme.md",
-"autorest_command": "autorest --version=3.9.7 --typescript --modelerfour.lenient-model-deduplication --azure-arm --head-as-boolean=true --license-header=MICROSOFT_MIT_NO_VERSION --generate-test --typescript-sdks-folder=D:\\Git\\azure-sdk-for-js ..\\azure-rest-api-specs\\specification\\datafactory\\resource-manager\\readme.md --use=@autorest/[email protected].13 --generate-sample=true",
+"autorest_command": "autorest --version=3.9.7 --typescript --modelerfour.lenient-model-deduplication --azure-arm --head-as-boolean=true --license-header=MICROSOFT_MIT_NO_VERSION --generate-test --typescript-sdks-folder=D:\\Git\\azure-sdk-for-js ..\\azure-rest-api-specs\\specification\\datafactory\\resource-manager\\readme.md --use=@autorest/[email protected].14 --generate-sample=true",
"repository_url": "https://github.com/Azure/azure-rest-api-specs.git",
"release_tool": "@azure-tools/[email protected]",
-"use": "@autorest/[email protected].13"
+"use": "@autorest/[email protected].14"
}
2 changes: 1 addition & 1 deletion sdk/datafactory/arm-datafactory/assets.json
@@ -2,5 +2,5 @@
"AssetsRepo": "Azure/azure-sdk-assets",
"AssetsRepoPrefixPath": "js",
"TagPrefix": "js/datafactory/arm-datafactory",
-"Tag": "js/datafactory/arm-datafactory_ecfa0d543b"
+"Tag": "js/datafactory/arm-datafactory_826c384657"
}
3 changes: 1 addition & 2 deletions sdk/datafactory/arm-datafactory/package.json
@@ -3,7 +3,7 @@
"sdk-type": "mgmt",
"author": "Microsoft Corporation",
"description": "A generated SDK for DataFactoryManagementClient.",
-"version": "13.0.1",
+"version": "14.0.0",
"engines": {
"node": ">=18.0.0"
},
@@ -78,7 +78,6 @@
"pack": "npm pack 2>&1",
"extract-api": "api-extractor run --local",
"lint": "echo skipped",
-"audit": "echo skipped",
"clean": "rimraf --glob dist dist-browser dist-esm test-dist temp types *.tgz *.log",
"build:node": "echo skipped",
"build:browser": "echo skipped",
138 changes: 114 additions & 24 deletions sdk/datafactory/arm-datafactory/review/arm-datafactory.api.md

Large diffs are not rendered by default.

@@ -10,7 +10,7 @@
// Licensed under the MIT License.
import {
RunFilterParameters,
-DataFactoryManagementClient
+DataFactoryManagementClient,
} from "@azure/arm-datafactory";
import { DefaultAzureCredential } from "@azure/identity";
import * as dotenv from "dotenv";
@@ -33,15 +33,15 @@ async function activityRunsQueryByPipelineRun() {
const runId = "2f7fdb90-5df1-4b8e-ac2f-064cfa58202b";
const filterParameters: RunFilterParameters = {
lastUpdatedAfter: new Date("2018-06-16T00:36:44.3345758Z"),
-lastUpdatedBefore: new Date("2018-06-16T00:49:48.3686473Z")
+lastUpdatedBefore: new Date("2018-06-16T00:49:48.3686473Z"),
};
const credential = new DefaultAzureCredential();
const client = new DataFactoryManagementClient(credential, subscriptionId);
const result = await client.activityRuns.queryByPipelineRun(
resourceGroupName,
factoryName,
runId,
-filterParameters
+filterParameters,
);
console.log(result);
}
@@ -10,7 +10,7 @@
// Licensed under the MIT License.
import {
ChangeDataCaptureResource,
-DataFactoryManagementClient
+DataFactoryManagementClient,
} from "@azure/arm-datafactory";
import { DefaultAzureCredential } from "@azure/identity";
import * as dotenv from "dotenv";
@@ -32,7 +32,8 @@ async function changeDataCaptureCreate() {
const factoryName = "exampleFactoryName";
const changeDataCaptureName = "exampleChangeDataCapture";
const changeDataCapture: ChangeDataCaptureResource = {
-description: "Sample demo change data capture to transfer data from delimited (csv) to Azure SQL Database with automapped and non-automapped mappings.",
+description:
+"Sample demo change data capture to transfer data from delimited (csv) to Azure SQL Database with automapped and non-automapped mappings.",
allowVNetOverride: false,
sourceConnectionsInfo: [],
targetConnectionsInfo: [],
Expand All @@ -44,7 +45,7 @@ async function changeDataCaptureCreate() {
resourceGroupName,
factoryName,
changeDataCaptureName,
-changeDataCapture
+changeDataCapture,
);
console.log(result);
}
@@ -64,7 +65,8 @@ async function changeDataCaptureUpdate() {
const factoryName = "exampleFactoryName";
const changeDataCaptureName = "exampleChangeDataCapture";
const changeDataCapture: ChangeDataCaptureResource = {
-description: "Sample demo change data capture to transfer data from delimited (csv) to Azure SQL Database. Updating table mappings.",
+description:
+"Sample demo change data capture to transfer data from delimited (csv) to Azure SQL Database. Updating table mappings.",
allowVNetOverride: false,
status: "Stopped",
sourceConnectionsInfo: [],
Expand All @@ -77,7 +79,7 @@ async function changeDataCaptureUpdate() {
resourceGroupName,
factoryName,
changeDataCaptureName,
-changeDataCapture
+changeDataCapture,
);
console.log(result);
}
@@ -33,7 +33,7 @@ async function changeDataCaptureDelete() {
const result = await client.changeDataCapture.delete(
resourceGroupName,
factoryName,
-changeDataCaptureName
+changeDataCaptureName,
);
console.log(result);
}
@@ -33,7 +33,7 @@ async function changeDataCaptureGet() {
const result = await client.changeDataCapture.get(
resourceGroupName,
factoryName,
-changeDataCaptureName
+changeDataCaptureName,
);
console.log(result);
}
@@ -32,7 +32,7 @@ async function changeDataCaptureListByFactory() {
const resArray = new Array();
for await (let item of client.changeDataCapture.listByFactory(
resourceGroupName,
-factoryName
+factoryName,
)) {
resArray.push(item);
}
@@ -33,7 +33,7 @@ async function changeDataCaptureStart() {
const result = await client.changeDataCapture.start(
resourceGroupName,
factoryName,
-changeDataCaptureName
+changeDataCaptureName,
);
console.log(result);
}
@@ -33,7 +33,7 @@ async function changeDataCaptureStart() {
const result = await client.changeDataCapture.status(
resourceGroupName,
factoryName,
-changeDataCaptureName
+changeDataCaptureName,
);
console.log(result);
}
@@ -33,7 +33,7 @@ async function changeDataCaptureStop() {
const result = await client.changeDataCapture.stop(
resourceGroupName,
factoryName,
-changeDataCaptureName
+changeDataCaptureName,
);
console.log(result);
}
@@ -10,7 +10,7 @@
// Licensed under the MIT License.
import {
ManagedIdentityCredentialResource,
-DataFactoryManagementClient
+DataFactoryManagementClient,
} from "@azure/arm-datafactory";
import { DefaultAzureCredential } from "@azure/identity";
import * as dotenv from "dotenv";
@@ -35,16 +35,16 @@ async function credentialsCreate() {
properties: {
type: "ManagedIdentity",
resourceId:
-"/subscriptions/12345678-1234-1234-1234-12345678abc/resourcegroups/exampleResourceGroup/providers/Microsoft.ManagedIdentity/userAssignedIdentities/exampleUami"
-}
+"/subscriptions/12345678-1234-1234-1234-12345678abc/resourcegroups/exampleResourceGroup/providers/Microsoft.ManagedIdentity/userAssignedIdentities/exampleUami",
+},
};
const credential = new DefaultAzureCredential();
const client = new DataFactoryManagementClient(credential, subscriptionId);
const result = await client.credentialOperations.createOrUpdate(
resourceGroupName,
factoryName,
credentialName,
-credentials
+credentials,
);
console.log(result);
}
@@ -33,7 +33,7 @@ async function credentialsDelete() {
const result = await client.credentialOperations.delete(
resourceGroupName,
factoryName,
-credentialName
+credentialName,
);
console.log(result);
}
@@ -33,7 +33,7 @@ async function credentialsGet() {
const result = await client.credentialOperations.get(
resourceGroupName,
factoryName,
-credentialName
+credentialName,
);
console.log(result);
}
@@ -32,7 +32,7 @@ async function credentialsListByFactory() {
const resArray = new Array();
for await (let item of client.credentialOperations.listByFactory(
resourceGroupName,
-factoryName
+factoryName,
)) {
resArray.push(item);
}
@@ -10,7 +10,7 @@
// Licensed under the MIT License.
import {
DataFlowDebugPackage,
-DataFactoryManagementClient
+DataFactoryManagementClient,
} from "@azure/arm-datafactory";
import { DefaultAzureCredential } from "@azure/identity";
import * as dotenv from "dotenv";
@@ -43,12 +43,12 @@ async function dataFlowDebugSessionAddDataFlow() {
name: "source1",
dataset: {
type: "DatasetReference",
-referenceName: "DelimitedText2"
-}
-}
+referenceName: "DelimitedText2",
+},
+},
],
-transformations: []
-}
+transformations: [],
+},
},
datasets: [
{
@@ -62,24 +62,24 @@ async function dataFlowDebugSessionAddDataFlow() {
firstRowAsHeader: true,
linkedServiceName: {
type: "LinkedServiceReference",
-referenceName: "linkedService5"
+referenceName: "linkedService5",
},
location: {
type: "AzureBlobStorageLocation",
container: "dataflow-sample-data",
-fileName: "Ansiencoding.csv"
+fileName: "Ansiencoding.csv",
},
-quoteChar: '"'
-}
-}
+quoteChar: '"',
+},
+},
],
debugSettings: {
datasetParameters: { Movies: { path: "abc" }, Output: { time: "def" } },
parameters: { sourcePath: "Toy" },
sourceSettings: [
{ rowLimit: 1000, sourceName: "source1" },
-{ rowLimit: 222, sourceName: "source2" }
-]
+{ rowLimit: 222, sourceName: "source2" },
+],
},
linkedServices: [
{
@@ -89,18 +89,18 @@ async function dataFlowDebugSessionAddDataFlow() {
annotations: [],
connectionString:
"DefaultEndpointsProtocol=https;AccountName=<storageName>;EndpointSuffix=core.windows.net;",
-encryptedCredential: "<credential>"
-}
-}
+encryptedCredential: "<credential>",
+},
+},
],
-sessionId: "f06ed247-9d07-49b2-b05e-2cb4a2fc871e"
+sessionId: "f06ed247-9d07-49b2-b05e-2cb4a2fc871e",
};
const credential = new DefaultAzureCredential();
const client = new DataFactoryManagementClient(credential, subscriptionId);
const result = await client.dataFlowDebugSession.addDataFlow(
resourceGroupName,
factoryName,
-request
+request,
);
console.log(result);
}