I tried the approach below and it works; please check it.
1) Explode the map column into one row per key/value pair.
2) Apply unbase64 only to the values whose keys are base64-encoded ('x-insider-token' and 'x-ent-auth').
3) Turn the exploded rows back into single-entry maps, group on the primary-key column, and merge them.
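For anyone who wants to reproduce this, here is a minimal setup I used (the view name and columns match the query below; the row values and the base64 token are made up — "dGVzdC10b2tlbg==" is just "test-token" encoded):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("map-unbase64").master("local[*]").getOrCreate()
import spark.implicits._

// Hypothetical sample data: several rows can carry parts of the map for the same Id.
Seq(
  ("corrId1", Map("custNo" -> "415723", "x-insider-token" -> "dGVzdC10b2tlbg==")),
  ("corrId1", Map("channel" -> "netsite"))
).toDF("Id", "map_value").createOrReplaceTempView("kafka_avro_events")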
val query = """
  SELECT CAST(Id AS STRING) AS Primary_ID,
         key_index,
         CASE WHEN key_index IN ('x-insider-token', 'x-ent-auth')
              THEN CAST(unbase64(CAST(key_value AS STRING)) AS STRING)
              ELSE key_value
         END AS value
  FROM kafka_avro_events
  LATERAL VIEW EXPLODE(map_value) AS key_index, key_value
"""
val df1 = spark.sql(query)
import org.apache.spark.sql.functions.{collect_list, map}
import spark.implicits._

val result = df1
  .withColumn("map", map($"key_index", $"value"))       // rewrap each row as a single-entry map
  .groupBy("Primary_ID")
  .agg(collect_list("map"))                             // one Seq[Map[String, String]] per id
  .as[(String, Seq[Map[String, String]])]
  .map { case (id, list) => (id, list.reduce(_ ++ _)) } // merge the single-entry maps into one
  .toDF("id", "gMap")
result.show(truncate = false)
+-------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
|id |gMap |
+-------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
|corrId1|Map(f-expaneded-port -> 451, custNo -> 415723, channel -> netsite, y-ub-eg-en-author -> {"decision":"PERMIT","authorized":true}, sessionid -> e5cdb71d3572dd6f7gh8jh6dssf8g688dda0, y-expandeded-proto -> https)|
+-------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
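As a side note, on Spark 3.0+ you can probably avoid the explode/groupBy round trip with the higher-order function transform_values, which rewrites map values in place. A rough sketch (untested against your actual schema; if one Id spans several rows you would still need a groupBy to merge the maps):

// Requires Spark 3.0+ for transform_values in SQL.
val alt = spark.sql("""
  SELECT CAST(Id AS STRING) AS id,
         transform_values(map_value, (k, v) ->
           CASE WHEN k IN ('x-insider-token', 'x-ent-auth')
                THEN CAST(unbase64(CAST(v AS STRING)) AS STRING)
                ELSE v
           END) AS gMap
  FROM kafka_avro_events
""")
alt.show(truncate = false)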