Qlikview - Insert a recno() for each change in a column
I have a table with document numbers and items. We have integrated our software with a third-party accounting package. The import into the third-party system does not carry over our document line numbers; however, it posts the rows into their backend in the same order.
For Example:
My Software:
The import to the third-party software (it only takes limited fields, as they do not manage stock or batch-controlled stock):
My goal is to create a new column in the second table that adds a row number for each change in document number. This will let me build a unique key to link the two databases' tables together.
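For illustration (this sketch is not from the original post; the document numbers and the '-' separator are only examples), the new column and key in the second table should end up looking roughly like this:
// Illustrative only: [Row. No.] restarts at 1 whenever [Doc. No.] changes,
// and the composite key is built from the two fields.
desiredShape:
load
    *,
    [Doc. No.] & '-' & [Row. No.] as Key
inline [
Doc. No., Row. No.
Doc 1, 1
Doc 1, 2
Doc 1, 3
Doc 2, 1
];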
sql qlikview
asked Nov 13 '18 at 10:30 by AgentShwang
1 Answer
If I understand the issue correctly, you want [Row. No.] to start at 1 for every new value of [Doc. No.], and to increase by 1 as long as [Doc. No.] is the same as on the previous row. One way to accomplish that could be the following:
//load the table, in this example it is an inline table
//but you would load it from another source
table2:
load * inline [
Doc. No.,Description,Qty,Batch,Value
Doc 1, Item1, 1, 10, 100
Doc 1, Item1, 2, 10, 100
Doc 1, Item1, 3, 10, 100
Doc 2, Item2, 1, 20, 200
Doc 3, Item3, 1, 30, 300
Doc 3, Item3, 1, 30, 300
Doc 3, Item3, 1, 30, 300
];
//define a dummy table into which the new rows can be concatenated
newTable2:
load * inline [
dummy
1
];
//define starting values to be used in the loop
let rownodigit = 0;
let lastdocno = '';
//Iterate over each row in the source table and use its values to build
//the new table row by row. If the table is large this will
//take a significant amount of time...
FOR rowno = 0 TO noOfRows('table2')-1
let docno = peek('Doc. No.', rowno, 'table2');
let desc = peek('Description', rowno, 'table2');
let qty = peek('Qty', rowno, 'table2');
let batch = peek('Batch', rowno, 'table2');
let value = peek('Value', rowno, 'table2');
//determine what value the [Row. No.] field is to be given
if docno=lastdocno then
rownodigit = rownodigit + 1;
else
rownodigit = 1
endif
//build the table by generating a new row into the new table
concatenate (newTable2)
load
'$(docno)' as [Doc. No.],
'$(desc)' as [Description],
$(qty) as [Qty],
$(batch) as [Batch],
$(value) as [Value],
$(rownodigit) as [Row. No.]
autogenerate (1)
;
let lastdocno = docno; //store the value of docno into a new variable
//for comparison in the top of the loop in
//the next iteration
NEXT rowno
drop field dummy; //this field was only needed to create the temporary table
drop table table2; //drop the original table, otherwise we would have
//a lot of synthetic keys
//now fix the table and create the [Key] field
table2:
load *, [Doc. No.]&[Row. No.]&[Description] as Key
resident newTable2
where len([Doc. No.])>0 //this avoids the blank row generated
//by the first dummy insert
;
//this was a temporary table that we generated and it should now be dropped
drop table newTable2;
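As a side note (a minimal sketch, not part of the original answer), the same numbering can normally be produced without the row-by-row loop by using peek() in a single resident load, assuming the rows already come out of table2 in posting order:
// Loop-free alternative: restart the counter whenever [Doc. No.] differs
// from the previously loaded row, otherwise add 1 to the previous counter.
numbered:
load
    *,
    if([Doc. No.] = peek('Doc. No.'), peek('Row. No.') + 1, 1) as [Row. No.]
resident table2;
drop table table2;
// build the same kind of composite key as above
table2:
load *, [Doc. No.] & [Row. No.] & [Description] as Key
resident numbered;
drop table numbered;
On large tables this approach is typically much faster than generating one row per loop iteration.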
answered Nov 13 '18 at 14:18 by henqlik, edited Nov 13 '18 at 16:31
Works perfectly. I couldn't figure out the logic. Makes perfect sense seeing it work. Thank you
– AgentShwang
Nov 16 '18 at 13:45