Simple CSV export from Data Store

:bullseye: What is your goal?

Pull filtered data from a Data Store and export to CSV

:thinking: What is the problem & what have you tried?

When I did get data coming through, it wrote a separate file for each entry in the data store, so if I have 4 entries to export I get 4 separate files. I have tried Create CSV and the Array Aggregator.

I have a data store that is populated with data from Tally and that is working fine.

I’ve created another scenario to run once a day that will pull filtered data from that data store and export to a CSV file.

I’m using Google Drive for testing this just now but it will end up in a S3 bucket once working properly.

I’ve tried using Parse CSV, Create CSV (normal and advanced) and also using the Array Aggregator but all I can get is a separate CSV file for each entry with just the one line in it.

So if I have 4 entries to export then I get 4 separate files with one line in each. I need all four entries to be on one CSV file.

This should be a very simple task yet I can’t find a way to make it work. Any ideas where I may be going wrong here?

Many thanks


Hello @gazza35, I can see what’s going on here; you’re just missing one key step.

How about you try this:

  1. Data store > Search records (which returns multiple bundles)
  2. Array Aggregator: from the Data Store (Search Records), map the fields you want (e.g. name, email, etc.). This combines ALL records into one array.
  3. Create CSV (in the rows field, select the Array Aggregator)
  4. Upload the file

If you need further assistance let me know, and please leave feedback.
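If it helps to see the idea outside of Make, here is a plain Python sketch of what the aggregation step achieves (the field names are just examples, not your actual data structure): collect all records first, then write a single file.

```python
import csv
import io

# Hypothetical records standing in for the bundles returned by
# Data store > Search records (field names are made up):
records = [
    {"name": "Alice", "email": "alice@example.com"},
    {"name": "Bob", "email": "bob@example.com"},
    {"name": "Carol", "email": "carol@example.com"},
    {"name": "Dave", "email": "dave@example.com"},
]

# Aggregating first means every record lands in ONE file. Without an
# aggregation step, each bundle flows through the scenario separately
# and triggers its own one-line upload.
buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=["name", "email"])
writer.writeheader()
writer.writerows(records)

csv_text = buffer.getvalue()
```

One header row plus four data rows, in one file, is the behaviour you want from the scenario.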

Hi @Bolex_creatives

Thanks for the reply

I’m pretty sure there’s something wrong with my setup in the Array Aggregator. Everything I’ve seen online seems to mention selecting the target module but the only option I get there is ‘Custom’.

If this is referring to a data structure then I have one saved but it’s not offering me that as an option.

In the Aggregated Fields section I’m choosing all four entries; is this correct? It doesn’t seem to have much of an effect whatever I choose there.

I’m also unsure what exactly I should be picking in the Group By section. I’ve tried picking all the fields that I want in my output, but again it doesn’t seem to change anything. I’ve also tried choosing ‘record’ and ‘total number of bundles’, but that doesn’t fix anything either.

The only way I’ve managed to get anything to output is to go directly to the Google Drive Upload a File module and choose the fields I want from there (also manually putting the headers in the output field), but that is when I get a separate file for each entry in the data store.

If I use the Create CSV module before trying the output to Google Drive, I get these options. I’ve tried just array and then both key and array:

But then the only options I get for data fields in the Google Drive Upload A File module are these and it gives me a load of random data in the output which doesn’t work:

The Create CSV module also has an "Include Headers in the First Row" option, yet if I choose "Yes" I get no option to specify the headers and have to manually put them in the data section of the Google Drive output module.

I’m clearly doing something wrong here or something is missing but I’m struggling to get my head around how something which should be a simple task is presenting so many problems.

Apologies for the long reply but hope it gives a picture of what I’m trying to do, and/or what I’m doing wrong.

Many thanks

Hey there,

the Create CSV module already works as an aggregator. You don’t need the Array Aggregator as well.

The reason you are getting only one item is that inside the CSV, you mapped the first item from the resulting array. But this is irrelevant, since you can just delete the array aggregator.

Mate, next time AI generates the answer for you, can you at least verify it before blindly copy-pasting it here? This one is plainly wrong.

Hi @Stoyan_Vatov

Many thanks for responding

So I’ve just removed the Array Aggregator as you suggested and have a simple scenario now with just the three modules:

When I run this and look at the inputs/outputs from the Create CSV module I get this:

I can see the three input bundles and all the data correctly which is great.

On the output there is only one bundle, but it appears to have aggregated the three inputs into one output, which I presume is correct. However, none of the field data from the input appears in the output, other than the ‘Key’ field.

In the Create CSV module options I am choosing all the fields presented in the ‘Aggregated Fields’ dropdown. Is this correct?

When I look in the Google Drive Upload A File module to pick my data to output I’m just getting these options now and also there doesn’t seem to be a way of adding headers to the output file other than hardcoding them in the Data field as I’ve done here:

If I choose either of those output items I just end up with nonsense in the file instead of my data:

It appears the data isn’t transferring correctly in the Create CSV module but I don’t understand why. It’s all there in the input side but not on the output.

Very confusing!

Thanks again for your help

Oh sorry yes. Since the Search module returns a collection, you will need to use the Create CSV advanced module. With it you should be able to use the same data structure that the Data store uses (you should be able to select it from the drop down). Then just map each corresponding value to create the proper CSV.
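As a rough Python analogy of why the advanced module is needed (the nested layout below is assumed purely for illustration, not Make’s actual internal format): when each record’s fields sit inside a collection, you have to map each value out into a flat row explicitly, which is effectively what Create CSV (advanced) does via the data structure.

```python
import csv
import io

# Search records returns bundles whose fields sit inside a collection;
# this nested shape is a made-up stand-in for illustration:
bundles = [
    {"key": "k1", "data": {"name": "Alice", "barcode": "A100"}},
    {"key": "k2", "data": {"name": "Bob", "barcode": "A200"}},
]

# The "advanced" step: explicitly map each value out of the collection
# into a flat row, like mapping fields in Create CSV (advanced):
rows = [
    {"name": b["data"]["name"], "barcode": b["data"]["barcode"]}
    for b in bundles
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["name", "barcode"])
writer.writeheader()
writer.writerows(rows)
csv_text = buf.getvalue()
```

A flat writer pointed at the top level would only ever see the collection itself, which is why the simple mapping only surfaced the ‘Key’ field.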


Thank you so much @Stoyan_Vatov

I’ve finally been able to get it to output multiple records to one file using the Create CSV Advanced module, absolutely incredible!

Just got another small query though, maybe you can help.

I have some dates stored as part of the records in my data store. I need to output them in YYYY-MM-DD format, but in the output the date looks like this: "2026-04-04T23:00:00.000Z".

I’ve tried using {{formatDate(DOB; YYYY-MM-DD)}} in the data structure mapping in the Create CSV module but whatever format I change it to, it just outputs exactly as above.

Is this the correct place to be trying to format the date? I can’t see anywhere else obvious to do it. I’m not sure why it’s not working, as the source data in my data store is set to date format, and I was able to use the same formatDate function successfully when I was mapping the data fields in the Google Drive output module previously (when I had it set up wrongly and was getting individual files).

I even tried inserting a Tools > Set a Variable module between the data store and Create CSV, formatting the date there and using the formatted variable as the output in Create CSV, but again I get the exact same date format in the output, i.e. "2026-04-04T23:00:00.000Z".

Any ideas on how to get it so it just outputs YYYY-MM-DD for the date and no timestamp?

Thanks again for all your help, it is really appreciated

Happy you got it solved; I’m just coming online now.

Uf yeah. You need to change the data structure and have that field be text and not date. Otherwise it will always convert it and output the full timestamp.

You can do that for the full data structure, or create a separate identical data structure for the CSV with the DOB field as text instead of date.

Brilliant!

I had tried changing the datatype in the data structure earlier but was getting errors.

I think my mistake was in not parsing the data first.

I’ve just tried using {{formatDate(parseDate(DOB; YYYY-MM-DD); YYYY-MM-DD)}} and it works great so thanks again for pointing me in the right direction.
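For reference, the transformation that formula performs is equivalent to this Python sketch (using the sample timestamp from my earlier post; the exact parsing details are just an illustration of the idea):

```python
from datetime import datetime

# The data store was returning the date as a full ISO timestamp:
raw = "2026-04-04T23:00:00.000Z"

# Equivalent of parsing the stamp and re-emitting only the date part,
# like {{formatDate(parseDate(DOB; ...); YYYY-MM-DD)}}:
parsed = datetime.strptime(raw, "%Y-%m-%dT%H:%M:%S.%fZ")
formatted = parsed.strftime("%Y-%m-%d")
```

The key point is that the value has to be parsed into a date first; formatting alone does nothing if the module re-serialises the raw date field afterwards.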

Thanks again @Stoyan_Vatov for all your help with this

I have one final question if you can help as this is something else I can’t figure out how to get the selection data for

In this flow the original data source is a Data Store. In that Data Store I have a column called ‘processed’ and this is set to false by default. I have a filter on the scenario so it only picks up records that have ‘false’ as the value against this column. This all works fine.

Once the CSV file is outputted, I would like to update the records that have been exported so that the processed column is set to ‘true’ and they don’t get picked up in the next export. I’ve also added a ‘processed_at’ date field so I can put a timestamp on when they were exported.

I’ve added a Data Store Update a Record module at the end, after the export to Google Drive, and have set the processed field to ‘true’ and the processed_at date field to the date token ‘now’.

The issue I’m having is that I can’t find a way of it selecting just the records that have been exported to the CSV. There are two unique fields in the data store, one is the Key (which isn’t exported to the CSV) and the other is a barcode field which is exported. Either one of these as the selection criteria would work.
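In plain Python terms, the behaviour I’m after would look something like this (the in-memory dict is just a made-up stand-in for the Data Store, not how Make stores anything):

```python
from datetime import datetime, timezone

# In-memory stand-in for the Data Store (keys and fields are made up):
data_store = {
    "key1": {"barcode": "A100", "processed": False},
    "key2": {"barcode": "A200", "processed": False},
    "key3": {"barcode": "A300", "processed": True},  # already exported
}

# 1. Search records: pick up only unprocessed entries
to_export = {k: v for k, v in data_store.items() if not v["processed"]}

# 2. ...Create CSV and the Google Drive upload would happen here...

# 3. Update a record: flag exactly the exported entries by their key
for key in to_export:
    data_store[key]["processed"] = True
    data_store[key]["processed_at"] = datetime.now(timezone.utc).isoformat()
```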

When I try to select the key for it to recognise which records to update, I get the fields from the Google Drive upload (nothing seems relevant there) and the aggregated output from the CSV module:

I’m not sure how I can therefore identify which records to update.

At present it’s just writing a spurious entry into the data store based on the output fields of the CSV.

The only way I could think of that might work would be to put a router on the initial Data Store Search Records module going straight to Data Store Update a Record, but my concern is that if something fails in the subsequent modules (Create CSV > Upload to Google Drive) the entries would still be marked as successfully processed, which is a risk.

I’m sure there is probably an easier way to handle this but I can’t seem to find anything after searching online.

Thanks again for all your help today

Yeah, you add a router between the filter and the CSV module, then on the second route add the Update a Record module to update the specific record coming from the search module with Processed = true and the timestamp.

Or you just put the module after the filter and before the CSV module. But if it errors out, it will break the entire flow. So the top option is better.

Many thanks @Stoyan_Vatov

I’ve just put the router between the initial Data Store query/filter and then have the original Create CSV > Export pathway as the 1st branch and then the 2nd branch I have the Data Store update which sets the processed=true flag.

I’ve just run it and it works fine. I’m just not completely sure about the behaviour of the router module, i.e. does it only run the 2nd pathway once the first has completed, or does it run them in parallel?

It would be perfect if it ran them in sequence, so that if an error happened the data store on the 2nd pathway wouldn’t be updated with processed=true. It’s not a huge problem if not, as I would get a notification if something had failed to execute anyway and could fix it manually.

Many thanks

You would think so, but no: it will run the aggregation last and the update first, regardless of the order on the router. But now that it’s on a separate path, you can easily add whatever error handling you want to make sure the main path always executes.

You can start with something simple like customizing the run name followed by an ignore. Then everything will still execute even if there is an error, and you will have an easy-to-spot place showing that an error occurred. Or have it send you a Slack message or an email when an error happens. Then you can build something more complex depending on what errors you are getting and how you want to handle the different ones.

But to be honest, Data Stores are a built-in Make feature, so they are binary: they will either work or they won’t. Either the entire service is broken and everything will give you an error, or everything will work and you won’t be getting random errors.

Many thanks again @Stoyan_Vatov

I’m relatively new to make.com and still trying to figure out exactly how all the features work. I’ll definitely do some reading up on error handling and how the flows work with routers etc.

Very glad to have got the flow working with the CSV export though and now updating the flags too which is a brilliant start.

I can worry about the other bits down the line when I’ve learned a bit more about how exactly everything works and fits together.

You’ve been an amazing help in pointing me in the right direction to get this far, so many thanks indeed.


@gazza35
That’s normal; "Custom" just means nothing is preselected.

Set Source Module to your Data Store module, then manually map the fields into the array.

You don’t need to pick a target module; just make sure the output is one aggregated array, then pass that into Create CSV.