Azure Data Factory throws "Length Required" error on copy from SQL to ADLS
I am trying to copy data from an on-prem SQL Server to Azure Data Lake Storage (ADLS) via Azure Data Factory (ADF). Everything seems to work, except that when I run (debug or trigger) the pipeline, I get the error:
"errorCode": "2200",
"message": "Failure happened on 'Sink' side. ErrorCode=UserErrorAdlsFileWriteFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Writing to 'AzureDataLakeStore' failed. Message: The remote server returned an error: (411) Length Required.. Response details: rnLength RequiredrnrnLength Required
rnHTTP Error 411. The request must be chunked or have a content length.
rnrn,Source=Microsoft.DataTransfer.ClientLibrary,''Type=System.Net.WebException,Message=The remote server returned an error: (411) Length Required.,Source=System,'",
"failureType": "UserError",
"target": "CopyData1"
What is really odd is that the following pipelines DO work:
- SQL tbl1 -> SQL tbl2
- ADLS source.txt -> ADLS sink.txt
I.e. read/write access works as expected. The latter pipeline is also able to create/overwrite the sink.txt file.
But when I run the pipeline
- SQL tbl1 -> sink.txt
I get the Length Required error. And if sink.txt exists, the pipeline even deletes it!
I'm using ADFv2 and ADLS Gen1. ADF and ADLS reside in the same subscription/resource group, and I use a self-hosted/Azure Integration Runtime (for SQL/ADLS, respectively).
I have tested with a source statement as simple as "SELECT 1 Col". I have also tested without a dataset schema, and with schemas + mappings.
Is this a bug, or am I missing something?
Which “Length” is required?
EDIT 1: Minimal JSON scripts
pipeline1.json
"name": "pipeline1",
"properties":
"activities": [
"name": "CopyData1",
"type": "Copy",
"policy":
"timeout": "7.00:00:00",
"retry": 0,
"retryIntervalInSeconds": 30,
"secureOutput": false,
"secureInput": false
,
"typeProperties":
"source":
"type": "SqlSource",
"sqlReaderQuery": "SELECT TOP 1 'x' AS col1 FROM sys.tables"
,
"sink":
"type": "AzureDataLakeStoreSink"
,
"enableStaging": false,
"dataIntegrationUnits": 0
,
"inputs": [
"referenceName": "table1",
"type": "DatasetReference"
],
"outputs": [
"referenceName": "sink1",
"type": "DatasetReference"
]
]
table1.json
"name": "table1",
"properties":
"linkedServiceName":
"referenceName": "SqlServer1",
"type": "LinkedServiceReference"
,
"type": "SqlServerTable",
"typeProperties":
"tableName": "sys.tables"
sink1.json
"name": "sink1",
"properties":
"linkedServiceName":
"referenceName": "AzureDataLakeStore1",
"type": "LinkedServiceReference"
,
"type": "AzureDataLakeStoreFile",
"structure": [
"name": "col1",
"type": "String"
],
"typeProperties":
"format":
"type": "TextFormat",
"columnDelimiter": ",",
"rowDelimiter": "",
"nullValue": "\N",
"treatEmptyAsNull": true,
"skipLineCount": 0,
"firstRowAsHeader": true
,
"fileName": "sink1.txt",
"folderPath": "myDir"
EDIT 2: Summary of conducted tests
- SQL -> ADLS: Error
- Oracle -> ADLS: Error
- SQL -> Blob: OK
- Oracle -> Blob: OK
- SQL -> SQL: OK
- ADLS -> ADLS: OK
- AzureSQLDB -> ADLS: OK
sql-server azure azure-data-lake azure-data-factory-2
Google is not my friend. (411) Length Required is too generic.
– Martin Thøgersen
Nov 15 '18 at 14:14
Same issue occurs when sourcing from Oracle db to ADLS.
– Martin Thøgersen
Nov 16 '18 at 7:21
I can write from SQL/Oracle to Azure Blob storage without issues.
– Martin Thøgersen
Nov 16 '18 at 14:15
Is there any way you can paste the JSON definitions for both datasets and the pipeline with the CopyActivity, so we can see if we can reproduce this?
– Saul Cruz
Nov 16 '18 at 21:04
Do you have an activity ID? Have you contacted ADF support? What is your self-hosted IR version? Is it very old?
– Fang Liu
Nov 17 '18 at 10:44
1 Answer
Does your self-hosted IR have a proxy setting, or does its traffic go through a special network setup? Such an error is most likely caused by an intermediate proxy service when ADF's ADLS connector tries to talk to the ADLS service.
– Yingqin
Nov 19 '18 at 9:32
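[Editor's note] If you want to check this from the self-hosted IR machine, here is a minimal diagnostic sketch (not from the original answer; the account name is a placeholder). It prints whatever system proxy settings would apply and sends one unauthenticated HTTPS request to the ADLS Gen1 REST endpoint: ADLS itself should answer 401, so a 411 or a proxy-branded error page instead points at an intermediate proxy.

# Diagnostic sketch: run on the self-hosted IR host (Python 3).
# "myadlsaccount" is a placeholder ADLS Gen1 account name.
import urllib.error
import urllib.request

ADLS_URL = "https://myadlsaccount.azuredatalakestore.net/webhdfs/v1/?op=LISTSTATUS"

# Proxy settings the OS/environment would apply to outbound HTTPS calls.
print("System proxy settings:", urllib.request.getproxies() or "none detected")

try:
    # urlopen honours the system proxy by default.
    urllib.request.urlopen(ADLS_URL, timeout=10)
except urllib.error.HTTPError as e:
    # 401 means the request reached ADLS (it just lacks a token);
    # 411 or an HTML block page means a proxy/web gateway intercepted it.
    print("HTTP status:", e.code, e.reason)
except urllib.error.URLError as e:
    print("Could not reach the endpoint:", e.reason)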
I am not aware of any special proxy or network settings (unless you can be more specific). The self-hosted IR is installed on the development machine, but on the same network as the on-prem servers.
– Martin Thøgersen
Nov 23 '18 at 14:41
Your comment seems a bit contradictory. How can the source side (self-hosted IR and a possible proxy issue) affect the sink side (Azure IR and ADLS)?
– Martin Thøgersen
Nov 23 '18 at 14:56
[SOLVED]: Whitelist azuredatalakestore.net in the proxy between the Self-hosted IR and ADLS. (In our case it was McAfee Web Gateway blocking it.) A key step in the solution was to understand that once ADF activities have been initialized by ADF, the data/communication goes directly from the Self-hosted IR to ADLS, not through ADF. We initially misunderstood this.
– Martin Thøgersen
Dec 5 '18 at 11:41
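[Editor's note] To complement the accepted fix, here is a hedged sketch of a direct write test from the self-hosted IR host (not from the original thread). It uses the azure-datalake-store Python SDK (assumed installed via pip) with placeholder credentials; if this succeeds, the proxy/firewall now lets the IR machine reach azuredatalakestore.net directly, which is the path the copy activity actually uses.

# Direct-write test: run on the self-hosted IR host.
# All values below are placeholders; use a service principal that has
# write access to the ADLS Gen1 account.
from azure.datalake.store import core, lib

token = lib.auth(tenant_id="<tenant-id>",
                 client_id="<app-id>",
                 client_secret="<app-secret>")

adl = core.AzureDLFileSystem(token, store_name="myadlsaccount")

# Write a tiny file to the same folder the ADF sink dataset targets.
with adl.open("/myDir/proxy_test.txt", "wb") as f:
    f.write(b"col1\r\nx\r\n")

# List the folder to confirm the file landed.
print(adl.ls("/myDir"))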