CSV parser error

Overview: I have built out this scenario to do the following (a rough sketch of the flow is below):

  1. Create an export request of a list view using an HTTP request
  2. Check the status of the request and retrieve the data using HTTP requests (I also currently have a pause in there as a workaround, since I can't get a conditional loop to work; it doesn't look like this can be done in Make?)
  3. Parse the data and add it to a Google Sheet
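Outside of Make, this is roughly the flow I'm going for. Below is only a hedged Python sketch: the endpoint URLs, the token, and the response field names are placeholders I made up, not the actual HubSpot export API.

```python
import csv
import io
import time

import requests

API_TOKEN = "YOUR_TOKEN"                          # placeholder token
EXPORT_URL = "https://api.example.com/exports"    # placeholder URL, not the real HubSpot endpoint
HEADERS = {"Authorization": f"Bearer {API_TOKEN}"}

# 1. Create the export request for the list view
task = requests.post(EXPORT_URL, headers=HEADERS, json={"format": "CSV"}).json()

# 2. Poll the status until the export is ready (this is the conditional loop
#    I am currently faking with a pause in Make)
while True:
    status = requests.get(f"{EXPORT_URL}/{task['id']}/status", headers=HEADERS).json()
    if status.get("status") == "COMPLETE":        # assumed status value
        break
    time.sleep(30)

# 3. Download and parse the CSV; these rows are what should end up in the Google Sheet
csv_text = requests.get(status["resultUrl"], headers=HEADERS).text   # assumed field name
rows = list(csv.reader(io.StringIO(csv_text)))
header, records = rows[0], rows[1:]
```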


Below is the data that is getting passed through:

“Record ID”,"Reporting Vertical ",“Contract Term Type”,“Scenario Flag”,“Close Date”,“Amount”,“Deal Name”,“Pipeline”,"Deal Source Attribution* ",“Deal Type”,“Create Date”,“Last Modified Date”
“17147406995”,"Payer Provider ",“Drop”,"Closed Won ",“2025-02-07 15:06”,“10000.0”,“Aetna - One-Time List”,“PL Sales Pipeline”,"AE Sourced ",“New Business”,“2024-01-24 20:47”,“2025-03-06 20:30”
“17890987799”,"Payer Provider ",“Drop”,“:red_square: Best Case”,“2025-03-31 14:44”,“35000.0”,“HiLabs Inc - Provider Affiliations”,“PL Sales Pipeline”,“Event”,“New Business”,“2024-03-06 16:40”,“2025-03-06 20:30”
“19736045898”,“Advertising”,“Subscription”,"Closed Won ",“2025-01-09 17:16”,“1200000.0”,“BGB Group - Platform and Claims Access (Enterprise)”,“PL Sales Pipeline”,"AE Sourced ",“New Business”,“2024-09-26 11:46”,“2025-03-06 20:30”
“20980757829”,“Advertising”,“Subscription”,"Closed Won ",“2025-01-06 10:37”,“40000.0”,“Everyday Health - Emails (Professionals Group)”,“PL Sales Pipeline”,"AE Sourced ",“New Business”,“2024-07-25 17:33”,“2025-03-06 20:30”
“21069669290”,"Life Science ",“Subscription”,"Closed Won ",“2025-01-07 16:49”,“396500.0”,"Currax - Obesity Brands (Mx/Rx Monthly Feed) ",“PL Sales Pipeline”,"AE Sourced ",“New Business”,“2024-07-30 10:35”,“2025-03-06 20:30”
“21134820487”,"Payer Provider ",“Subscription”,“:green_square: Worst Case”,“2025-03-31 11:01”,“20000.0”,“PatientGenie - New Deal”,“PL Sales Pipeline”,"AE Sourced ",“New Business”,“2024-08-02 10:42”,“2025-03-06 20:30”
“21228449887”,“Advertising”,“Subscription”,“:green_square: Worst Case”,“2025-03-31 12:15”,“150000.0”,“Cadent - Audiences and Measurement”,“PL Sales Pipeline”,"AE Sourced ",“New Business”,“2024-08-08 10:33”,“2025-03-07 09:42”

My issue: I receive the parser error when passing the data through from the HTTP request.

I have tried uploading the file to Google Drive and then using it from there, hoping that would help, but I still get the same error. I also ran into an encoding issue, but after I fixed that the data structure was still exactly the same, so I received the same error.

I have also tried the solution from this thread, but it didn't work for me either.

Also, to add based on my testing: I tried having it just save to Google Drive and convert to a Google Sheet, but it placed all the data in one row. The columns were separated correctly, but it was still just one row, which doesn't fit what I need.

GOAL: I am just looking to download this list view and then place the data in predetermined cells of a Google Sheet that I have already set up. Everything seems to work fine except getting the parser to work.

I feel like I'm at my wits' end after trying everything and scouring these forums.

Hey there,

Not sure if this serves your purpose, but you could try replace({{11.Data}};";{{emptystring}})?
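The idea is just to strip the quote character the parser is tripping on before the text gets split into columns. Roughly the same thing as a Python illustration (the sample line is shortened from the data you posted):

```python
# Strip the quotes, then split on commas (same idea as the Make replace() above).
line = '"17147406995","Payer Provider ","Drop","Closed Won ","2025-02-07 15:06","10000.0"'

cleaned = line.replace('"', "")   # drop every quote character
fields = cleaned.split(",")       # works only as long as no field itself contains a comma
print(fields[:3])                 # ['17147406995', 'Payer Provider ', 'Drop']
```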


Thank you! This was perfect and worked right away!

I really thought this was something else and not just an issue with it reading the quotes.


hey @HarveyM

Thank you for the fix here, but it looks like I have one more issue. Now that I have taken out the quotes to fix this, I get a new error that the number of columns does not match for "row 39". When I checked the data, I saw that there was a comma in the Deal Name in that row.
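To illustrate what I think is happening (a quick Python sketch; the deal name with a comma is invented, not the real row 39): once the quotes are stripped, a plain comma split can no longer tell the delimiter apart from the comma inside the deal name, whereas a quote-aware parser keeps the field together.

```python
import csv
import io

# A row where the deal name itself contains a comma (invented example, not my real data).
row = '"21234567890","Advertising","Subscription","Closed Won","2025-03-31 12:15","150000.0","Acme, Inc - Audiences"'

naive = row.replace('"', "").split(",")      # quotes stripped first, then split on commas
print(len(naive))                            # 8 columns: 'Acme' and ' Inc - Audiences' get split apart

proper = next(csv.reader(io.StringIO(row)))  # quote-aware parsing respects the quoted field
print(len(proper))                           # 7 columns; 'Acme, Inc - Audiences' stays whole
```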

I tried changing the JSON to use a different delimiter in the export from HubSpot, but it looks like HubSpot does not let me change the delimiter with delimiter="|" in the JSON.

I wasn't sure if I should make a new topic for this, but I wanted to post here first in case you have a possible workaround.

First off, my first solution was very hacky; this would be better, as it makes all the quotes equal:


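Put another way, the idea is to make every quote character the same plain straight quote so a quote-aware parser can then do the splitting (and cope with commas inside fields). A rough Python sketch of that idea, reusing one of the lines as it appears in your paste above:

```python
import csv
import io

# Normalize typographic quotes to plain straight quotes, then parse quote-aware.
raw = '“17890987799”,"Payer Provider ",“Drop”,“:red_square: Best Case”'

normalized = raw.replace("\u201c", '"').replace("\u201d", '"')   # “ and ” -> "
fields = next(csv.reader(io.StringIO(normalized)))
print(fields)   # ['17890987799', 'Payer Provider ', 'Drop', ':red_square: Best Case']
```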
Let me know if this is still an issue.

I did a bit of experimenting and got a very ugly but functional solution to enforce carriage returns in your CSV:


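The same idea as a hedged Python sketch: if the export comes through as one long line, insert a line break in front of each Record ID so every record lands on its own row. The 11-digit pattern is just an assumption based on the sample data above.

```python
import re

# If the CSV arrives as a single long line, start a new line before each record.
# Assumes every record begins with a quoted 11-digit Record ID, as in the sample above.
flat = '"17147406995","Payer Provider ","Drop","Closed Won " "17890987799","Payer Provider ","Drop"'

with_breaks = re.sub(r'\s*(?="\d{11}",)', "\n", flat).strip()
print(with_breaks)
# "17147406995","Payer Provider ","Drop","Closed Won "
# "17890987799","Payer Provider ","Drop"
```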
I am sure there is a better way, but hope that helps :slight_smile: