Azure Data Factory throws 'Length Required' error on copy from SQL to ADLS

I am trying to copy data from an on-prem SQL Server to Azure Data Lake Storage (ADLS) via Azure Data Factory (ADF). Everything seems to work, except that when I run (debug or trigger) the pipeline, I get the error:





"errorCode": "2200",
"message": "Failure happened on 'Sink' side. ErrorCode=UserErrorAdlsFileWriteFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Writing to 'AzureDataLakeStore' failed. Message: The remote server returned an error: (411) Length Required.. Response details: rnLength Requiredrnrn


Length Required

rn

HTTP Error 411. The request must be chunked or have a content length.

rnrn,Source=Microsoft.DataTransfer.ClientLibrary,''Type=System.Net.WebException,Message=The remote server returned an error: (411) Length Required.,Source=System,'",
"failureType": "UserError",
"target": "CopyData1"



What is really odd is that the following pipelines DO work:



  • SQL tbl1 -> SQL tbl2

  • ADLS source.txt -> ADLS sink.txt

That is, read/write access works as expected. The latter pipeline is also able to create/overwrite the sink.txt file.



But when I run the pipeline



  • SQL tbl1 -> sink.txt

I get the Length Required error. And if sink.txt exists, the pipeline even deletes it!



I'm using ADF v2 and ADLS Gen1. ADF and ADLS reside in the same subscription/resource group, and I'm using a self-hosted / Azure Integration Runtime (for SQL / ADLS, respectively).
I have tested with a source statement as simple as "SELECT 1 Col". I have also tested without a dataset schema, and with schemas + mappings.



Is this a bug, or am I missing something?
Which “Length” is required?
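
For reference, HTTP 411 means that whatever answered the request (the ADLS front end or an intermediate proxy) received an upload with neither a Content-Length header nor chunked transfer encoding it was willing to accept. Below is a minimal Python sketch of the two ways a client can declare the body size; the account name, path and payload are placeholders and authentication is omitted, so it only illustrates the request shapes:

import http.client

HOST = "myaccount.azuredatalakestore.net"                    # placeholder ADLS Gen1 account
PATH = "/webhdfs/v1/myDir/sink1.txt?op=CREATE&write=true"    # illustrative path only

# Style 1: body passed as bytes -> http.client adds a Content-Length header.
conn = http.client.HTTPSConnection(HOST)
conn.request("PUT", PATH, body=b"col1\r\nx\r\n")
print(conn.getresponse().status)

# Style 2: body passed as an iterator with encode_chunked=True ->
# the request goes out with Transfer-Encoding: chunked and no Content-Length.
# A proxy that cannot handle chunked uploads may reject it with 411 Length Required.
conn = http.client.HTTPSConnection(HOST)
conn.request("PUT", PATH, body=iter([b"col1\r\n", b"x\r\n"]), encode_chunked=True)
print(conn.getresponse().status)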




EDIT 1: Minimal JSON scripts



pipeline1.json




"name": "pipeline1",
"properties":
"activities": [

"name": "CopyData1",
"type": "Copy",
"policy":
"timeout": "7.00:00:00",
"retry": 0,
"retryIntervalInSeconds": 30,
"secureOutput": false,
"secureInput": false
,
"typeProperties":
"source":
"type": "SqlSource",
"sqlReaderQuery": "SELECT TOP 1 'x' AS col1 FROM sys.tables"
,
"sink":
"type": "AzureDataLakeStoreSink"
,
"enableStaging": false,
"dataIntegrationUnits": 0
,
"inputs": [

"referenceName": "table1",
"type": "DatasetReference"

],
"outputs": [

"referenceName": "sink1",
"type": "DatasetReference"

]

]




table1.json




"name": "table1",
"properties":
"linkedServiceName":
"referenceName": "SqlServer1",
"type": "LinkedServiceReference"
,
"type": "SqlServerTable",
"typeProperties":
"tableName": "sys.tables"





sink1.json




"name": "sink1",
"properties":
"linkedServiceName":
"referenceName": "AzureDataLakeStore1",
"type": "LinkedServiceReference"
,
"type": "AzureDataLakeStoreFile",
"structure": [

"name": "col1",
"type": "String"

],
"typeProperties":
"format":
"type": "TextFormat",
"columnDelimiter": ",",
"rowDelimiter": "",
"nullValue": "\N",
"treatEmptyAsNull": true,
"skipLineCount": 0,
"firstRowAsHeader": true
,
"fileName": "sink1.txt",
"folderPath": "myDir"

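For completeness: given the SELECT TOP 1 'x' AS col1 source query and firstRowAsHeader: true, the file this pipeline should produce is tiny; assuming default quoting, the expected sink1.txt is just a header line and one value:

col1
x

so the failure is clearly not related to the amount of data being written.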





EDIT 2: Summary of conducted tests



  • SQL -> ADLS: Error

  • Oracle -> ADLS: Error

  • SQL -> Blob: OK

  • Oracle -> Blob: OK

  • SQL -> SQL: OK

  • ADLS -> ADLS: OK

  • AzureSQLDB -> ADLS: OK

In short, the copies only fail when an on-prem source (via the self-hosted IR) is combined with an ADLS sink; the same sources write fine to Blob storage, and an Azure-hosted source (AzureSQLDB) writes fine to ADLS.







Tags: sql-server azure azure-data-lake azure-data-factory-2

asked Nov 15 '18 at 14:13, edited Nov 19 '18 at 19:47 – Martin Thøgersen

  • Google is not my friend. (411) Length Required is too generic.

    – Martin Thøgersen
    Nov 15 '18 at 14:14

  • Same issue occurs when sourcing from an Oracle db to ADLS.

    – Martin Thøgersen
    Nov 16 '18 at 7:21

  • I can write from SQL/Oracle to Azure Blob storage without issues.

    – Martin Thøgersen
    Nov 16 '18 at 14:15

  • Is there any way you can paste the JSON definitions for both datasets and the pipeline with the CopyActivity, so we can see if we can reproduce this?

    – Saul Cruz
    Nov 16 '18 at 21:04

  • Do you have an activity ID? Have you contacted ADF support? What is your self-hosted IR version? Is it very old?

    – Fang Liu
    Nov 17 '18 at 10:44

1 Answer
Does your self-hosted IR have a proxy setting, or does its traffic go through any special network configuration? Such an error is typically caused by an intermediate proxy service when ADF's ADLS connector tries to talk to the ADLS service.

– Yingqin, Nov 19 '18 at 9:32

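If it is not obvious whether the machine hosting the self-hosted IR sits behind a proxy, a quick check like the sketch below (run on that machine) prints the system-level proxy settings Python can see; the self-hosted IR can also carry its own proxy configuration, so treat this only as a first hint:

import urllib.request

# Shows the proxies configured at OS level (environment variables on Linux,
# Internet Options / registry on Windows). An empty dict means no system-wide
# proxy is visible from this machine.
print(urllib.request.getproxies())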
  • I am not aware of any special proxy or network settings (unless you can be more specific). The self-hosted IR is installed on the development machine, but on the same network as the on-prem servers.

    – Martin Thøgersen
    Nov 23 '18 at 14:41

  • Your comment seems a bit contradictory. How can the source side (self-hosted IR and a possible proxy issue) affect the sink side (Azure IR and ADLS)?

    – Martin Thøgersen
    Nov 23 '18 at 14:56

  • [SOLVED]: Whitelist azuredatalakestore.net in the proxy between the self-hosted IR and ADLS. (In our case it was McAfee Web Gateway blocking it.) A key step in the solution was to understand that once the activities have been initialized by ADF, the data/communication goes directly from the self-hosted IR to ADLS, not through ADF. We initially misunderstood this.

    – Martin Thøgersen
    Dec 5 '18 at 11:41

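To confirm that the whitelisting actually opened the path, a small reachability check from the self-hosted IR machine is enough. The account name below is a placeholder, and no credentials are sent, so any HTTP response at all (even 401) means the proxy now lets traffic through to azuredatalakestore.net:

import urllib.error
import urllib.request

# Placeholder account name; replace with the real ADLS Gen1 account.
ADLS_URL = "https://myaccount.azuredatalakestore.net/webhdfs/v1/?op=LISTSTATUS"

try:
    with urllib.request.urlopen(ADLS_URL, timeout=10) as resp:
        print("Reached ADLS, HTTP status:", resp.status)
except urllib.error.HTTPError as exc:
    # A 401/403 here still proves the network path (including the proxy) is open.
    print("Reached ADLS, HTTP status:", exc.code)
except urllib.error.URLError as exc:
    # DNS failures, proxy blocks and TLS-interception problems end up here.
    print("Could not reach ADLS:", exc.reason)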