
data mapping

Hello,

I am trying to map Salesforce object data to a BigQuery table. The mapping works for me when there are a handful of columns, but the object I am mapping to BigQuery has 500 fields, which I won't be able to map manually. I tried to code it using loops, but that won't work since the field names don't match on both ends. Is there a way to automate the mapping?

Solved
1 ACCEPTED SOLUTION

You can add additional variables in the Data Transformer template output, and they should get reflected in your logs.


7 REPLIES

Hi rish
Can you provide some examples of how data needs to be mapped? Is there any pattern or any convention for mapping field names?

You can follow the Data Mapping Overview for options. You can use either the Data Mapping Task or the Data Transformer Script Task to map your data.

Hi,

For now I can deal with the varied column names, but I have this issue where I have an output payload (JSON) which needs to be mapped to an input payload (a BigQuery table). I need to filter the keys from the output so that I map only those fields that are present in the input. Let's say my output is { "id": 1, "name": "ads", "place": "ads" } and the input has only id and name in its properties. I will have to filter them (considering there are 500 columns to be filtered). I tried to use a filter in data mapping, but all it does is filter based on the values in the JSON, not the keys. I tried the Data Transformer Script, but most of it looks like it's in preview. Do let me know if there are any other functions I can use.

It seems like the Data Transformer would be a good choice for your use case. However, you can also try a JavaScript Task to perform the mapping.
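For illustration, a JavaScript Task that keeps only the keys present in the target table could look roughly like this. This is a minimal sketch: the executeScript(event) entry point and the getParameter/setParameter accessors are assumptions about the task's interface, and the variable names (loadBq, bqColumns, newMapping) simply mirror the ones used in this thread.

```javascript
// Sketch of a JavaScript Task that filters a source payload down to the
// columns that exist in the target BigQuery table. The event interface
// shown here is an assumption; check the JavaScript Task documentation
// for the exact entry-point signature.
function executeScript(event) {
  var payload = event.getParameter("loadBq");    // e.g. {"id": 1, "name": "ads", "place": "ads"}
  var allowed = event.getParameter("bqColumns"); // e.g. ["id", "name"]

  var filtered = {};
  for (var i = 0; i < allowed.length; i++) {
    var key = allowed[i];
    if (payload.hasOwnProperty(key)) {
      filtered[key] = payload[key];
    }
  }

  event.setParameter("newMapping", filtered);    // e.g. {"id": 1, "name": "ads"}
}
```

Iterating over the allowed column list (rather than over the payload's keys) keeps the loop bounded by the 500 known columns and silently skips any column the payload happens not to carry.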

I did try the Data Transformer. What I did was loop over all the keys and try to filter out the required ones (I created a string array with all the required columns, called bqColumns), but it doesn't work since filter.contains only takes strings, not string arrays. I am attaching a screenshot of the data mapping task. I did try JavaScript; there were some issues with the syntax, which I posted about in a different thread (https://www.googlecloudcommunity.com/gc/Integration-Services/Javascript-in-application-integration/m...).

(Attachment: Screenshot 2023-08-01 at 9.15.37 PM.png)

It seems like you are referring to the Data Mapping Task instead of the Data Transformer Script Task. The Data Transformer script would look something like this for your use case:

local f = import "functions"; // Import additional functions

local bqColumns = std.extVar("bqColumns"); // Assuming value as ["a", "b"]
local loadBq = std.extVar("loadBq"); // Assuming value as {"a": 1, "foo": 2}

local newMapping = {
  [k]: loadBq[k],
  for k in std.objectFields(loadBq)
  if f.contains(bqColumns, k)
}; // This would return {"a": 1}

// TEMPLATE OUTPUT
{
   newMapping: newMapping // Mapping to new variable
}

My apologies, you are right. I tested my workflow with the above code by defining a newMapping variable to map to, but when I map newMapping to the output, it is basically empty, even though loadBq has data and bqColumns has columns matching the keys in loadBq. Is there a way I can log the variables to check where it is going wrong?

(Attachment: Screenshot 2023-08-01 at 9.58.23 PM.png)

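Additional variables in the Data Transformer template output show up in the execution logs. For example, the TEMPLATE OUTPUT section of the script above could be extended with extra fields so the intermediate values become visible (a sketch only; debugColumns and debugInput are illustrative names, not required ones):

```jsonnet
// TEMPLATE OUTPUT with extra debug fields (sketch)
{
  newMapping: newMapping,
  debugColumns: bqColumns,  // confirm the column list arrived as expected
  debugInput: loadBq,       // confirm the source payload is populated
}
```

Comparing debugColumns against the keys of debugInput in the logs should show whether the mismatch is in the column list, the payload, or the filtering itself.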
