ArrayType(StringType) to IntegerType conversion in spark dataframe

























I'm trying to group by the column host and aggregate the average of a column of type ArrayType(StringType) after casting it to ArrayType(IntegerType).



It throws the error below:


 cannot resolve 'avg(`variables`)' due to data type mismatch: function average requires numeric types, not ArrayType(IntegerType,true);


Input data - sample DataFrame before grouping:



+-------+-------------+-------+-----+--------------+-------------------+
|request|time         |type   |host |service       |variables          |
+-------+-------------+-------+-----+--------------+-------------------+
|REST   |1542111483170|RESTFUL|KAFKA|www.google.com|[Duration, 7, Type]|
|REST   |1542111486570|RESTFUL|KAFKA|www.google.com|[Duration, 9, Type]|
+-------+-------------+-------+-----+--------------+-------------------+


How can I cast or handle ArrayType(StringType) as IntegerType? The column variables is an array of structs (variable: String, value: String, Type: String), and I want to convert the second field, variable.value, to Integer for aggregation (average calculation).



Case classes:



 case class ServiceActivity(val request: String, val time: Long, val Type: String, val host: String, val service: String, val variables: Array[Variables])

 case class Variables(val variable: String, val value: String, val Type: String)
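
For reference, here is a minimal sketch of how a DataFrame built from these case classes reports its schema; the construction below is hypothetical (not from the original post), and the variables column comes out as an array of structs:

 // Hypothetical setup, only to illustrate the schema produced by the case classes above.
 import org.apache.spark.sql.SparkSession

 val spark = SparkSession.builder().master("local[*]").appName("schema-check").getOrCreate()
 import spark.implicits._

 val df = Seq(
   ServiceActivity("REST", 1542111483170L, "RESTFUL", "KAFKA", "www.google.com",
     Array(Variables("Duration", "7", "Type")))
 ).toDF()

 df.printSchema()
 // prints roughly:
 // root
 //  |-- request: string (nullable = true)
 //  |-- time: long (nullable = false)
 //  |-- Type: string (nullable = true)
 //  |-- host: string (nullable = true)
 //  |-- service: string (nullable = true)
 //  |-- variables: array (nullable = true)
 //  |    |-- element: struct (containsNull = true)
 //  |    |    |-- variable: string (nullable = true)
 //  |    |    |-- value: string (nullable = true)
 //  |    |    |-- Type: string (nullable = true)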


Code below:



// keep rows whose variables array contains a struct with variable == "Duration"
val report = df.select("*").where(array_contains(df("variables.variable"), "Duration"))
// replace variables with the array of value fields, cast to array<int>
val intermediate = report.withColumn("variables", col("variables.value").cast(org.apache.spark.sql.types.ArrayType(org.apache.spark.sql.types.IntegerType, true)))
// take the element at index 2 of the array and expose it again as the variables column
intermediate.withColumn("duration", $"variables".getItem(2)).drop("variables").withColumnRenamed("duration", "variables")


GroupBy code (throws the error):



 intermediate.groupBy(intermediate("host")).agg(Map("variables"->"avg"))
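
For completeness, here is a self-contained sketch that reproduces the same exception; the two-row dataset below is hypothetical, mirroring the sample rows above, and is not the original df:

 // Hypothetical repro: calling avg on an ArrayType column fails during analysis.
 import org.apache.spark.sql.SparkSession

 val spark = SparkSession.builder().master("local[*]").appName("avg-array-repro").getOrCreate()
 import spark.implicits._

 val sample = Seq(
   ("KAFKA", Seq("Duration", "7", "Type")),
   ("KAFKA", Seq("Duration", "9", "Type"))
 ).toDF("host", "variables")

 // Non-numeric entries become null after the cast; the column type is still array<int>.
 val asInts = sample.withColumn("variables", $"variables".cast("array<int>"))

 // Throws: cannot resolve 'avg(`variables`)' due to data type mismatch:
 // function average requires numeric types, not ArrayType(IntegerType,true)
 asInts.groupBy($"host").agg(Map("variables" -> "avg")).show()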


Any workarounds?



Thanks










apache-spark apache-spark-sql

asked Nov 14 '18 at 8:01 by Vicky, edited Nov 15 '18 at 5:33
























  • Can you share the schema of dataframe df?

    – Shankar Koirala
    Nov 14 '18 at 8:09












  • Question Updated

    – Vicky
    Nov 14 '18 at 8:14











  • @ShankarKoirala, any updates?

    – Vicky
    Nov 14 '18 at 10:14











  • Could you add more details on how you are inputting the Variables dataframe and combining it with the ServiceActivity dataframe? Sample code and data would be helpful to replicate the problem.

    – user238607
    Nov 14 '18 at 12:35











  • Can you please update in detail with the input data and required output data?

    – Shankar Koirala
    Nov 14 '18 at 13:23
















1 Answer

Sorted out by splitting the array and using the concat_ws method:



 val splitArray = ab.withColumn("Avg_Duration", concat_ws("", ab("variables.value")))

splitArray.groupBy(splitArray("host")).agg(Map("Avg_Duration" -> "avg")).show(false)
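
A note on a possible variant, as a sketch only: assuming each filtered row carries exactly one matching Variables struct (so variables.value is a one-element array), the concatenated value can be cast to an integer explicitly instead of relying on avg's implicit string-to-double conversion.

 import org.apache.spark.sql.functions._

 // Sketch: concat_ws collapses the one-element array<string> into a plain string,
 // which is then cast to int before averaging.
 val withDuration = ab.withColumn("duration", concat_ws("", col("variables.value")).cast("int"))

 withDuration.groupBy("host").agg(avg("duration").as("Avg_Duration")).show(false)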


Thank you






answered Nov 15 '18 at 5:34 – Vicky




























