Hi all, I'm working with a CSV file that I selected as a source file, and I want to filter duplicate records from that source file before they reach the destination. If I have to use an output filter, what kind of filter should I use? Is there any other way to filter out the duplicate records from the source CSV file?
Hi Dibyojyoti,
For an FTP export, you can group rows together.
Example:
Before:
{
  "page_of_records": [
    {
      "record": {
        "Column0": "US",
        "Column1": "CA",
        "Column2": "2"
      }
    },
    {
      "record": {
        "Column0": "US",
        "Column1": "AZ",
        "Column2": "3"
      }
    },
    {
      "record": {
        "Column0": "US",
        "Column1": "AR",
        "Column2": "5"
      }
    },
    {
      "record": {
        "Column0": "IN",
        "Column1": "AR",
        "Column2": "5"
      }
    },
    {
      "record": {
        "Column0": "IN",
        "Column1": "TS",
        "Column2": "8"
      }
    }
  ]
}
After:
{
  "page_of_records": [
    {
      "rows": [
        {
          "Column0": "US",
          "Column1": "CA",
          "Column2": "2"
        },
        {
          "Column0": "US",
          "Column1": "AZ",
          "Column2": "3"
        },
        {
          "Column0": "US",
          "Column1": "AR",
          "Column2": "5"
        }
      ]
    },
    {
      "rows": [
        {
          "Column0": "IN",
          "Column1": "AR",
          "Column2": "5"
        },
        {
          "Column0": "IN",
          "Column1": "TS",
          "Column2": "8"
        }
      ]
    }
  ]
}
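If it helps, here is a minimal standalone sketch (in Python, outside the platform) of the same idea: drop exact-duplicate rows from the source CSV, then group the remaining rows into the shape shown in the "After" example. The file name source.csv and the choice of Column0 as the grouping key are assumptions taken from the sample data, so substitute your own values.

import csv
import json
from itertools import groupby

# "source.csv" and the grouping key "Column0" are assumptions based on the
# sample data above -- adjust both for your actual file.
SOURCE_FILE = "source.csv"
GROUP_KEY = "Column0"

def dedupe_and_group(path, group_key):
    # Read the CSV and drop exact-duplicate rows while preserving order.
    with open(path, newline="") as f:
        reader = csv.DictReader(f)
        seen = set()
        unique_rows = []
        for row in reader:
            fingerprint = tuple(sorted(row.items()))  # whole row acts as the duplicate key
            if fingerprint not in seen:
                seen.add(fingerprint)
                unique_rows.append(row)

    # groupby only groups consecutive rows, so this assumes the file is
    # already ordered by the group key (as in the Before example);
    # sort unique_rows by that key first if it is not.
    grouped = [
        {"rows": list(rows)}
        for _, rows in groupby(unique_rows, key=lambda r: r[group_key])
    ]
    return {"page_of_records": grouped}

if __name__ == "__main__":
    print(json.dumps(dedupe_and_group(SOURCE_FILE, GROUP_KEY), indent=2))

Grouping by the first column like this keeps one "rows" array per distinct Column0 value, which makes it easier to spot or drop repeated records before they reach the destination.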
Let us know if this works for you, and feel free to reply if you have any queries.
Thanks @dibyojyotisamanta for the post and @sachingoud for the prompt response!
(I've moved this post from "Welcome to our community" to "Troubleshoot custom flows")
Hi Sachin, thanks for your valuable reply. It worked.