I’m working on a scenario that should handle the following import, sort, aggregation, and export task.
Instructions:
Fetch four different supplier CSVs from four different folders on the same FTP.
Collect EAN, price, stock, title and supplier name from each CSV (column names and locations differ in each file, so they may need to be mapped manually).
A duplicate-EAN check is then needed across all CSVs. If multiple suppliers sell the same EAN, the module should assign the cheapest supplier’s pricing and stock to the aggregated output file and exclude the other suppliers that share that EAN. This check is not needed for duplicates within the same supplier file. (A rough sketch of the logic I’m after follows these instructions.)
The uniformly structured, aggregated file should then be uploaded to a specific folder on the same FTP server.
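To make the logic concrete (independently of any specific platform), here is a rough Python sketch of what I’m describing. The host, credentials, folder paths, and per-supplier column mappings are only placeholders and would need to be set per file:

```python
import csv
from ftplib import FTP
from io import BytesIO, StringIO

# Hypothetical FTP details and per-supplier settings -- adjust to the real setup.
FTP_HOST = "ftp.example.com"
SUPPLIER_FILES = {
    # supplier name: (remote path, mapping from unified field -> column name in that file)
    "SupplierA": ("/supplier_a/feed.csv",
                  {"ean": "EAN", "price": "Price", "stock": "Qty", "title": "Name"}),
    "SupplierB": ("/supplier_b/export.csv",
                  {"ean": "barcode", "price": "net_price", "stock": "stock", "title": "description"}),
}

def fetch_csv(ftp, remote_path):
    """Download a remote CSV and return its rows as dicts."""
    lines = []
    ftp.retrlines(f"RETR {remote_path}", lines.append)
    return list(csv.DictReader(StringIO("\n".join(lines))))

best_by_ean = {}  # EAN -> cheapest record seen so far, across all suppliers

with FTP(FTP_HOST) as ftp:
    ftp.login("user", "password")  # placeholder credentials
    for supplier, (path, cols) in SUPPLIER_FILES.items():
        for row in fetch_csv(ftp, path):
            record = {
                "ean": row[cols["ean"]].strip(),
                "price": float(row[cols["price"]].replace(",", ".")),
                "stock": int(row[cols["stock"]]),
                "title": row[cols["title"]],
                "supplier": supplier,
            }
            # Keep only the cheapest supplier per EAN.
            current = best_by_ean.get(record["ean"])
            if current is None or record["price"] < current["price"]:
                best_by_ean[record["ean"]] = record

    # Write the uniformly structured, aggregated file and upload it back to the FTP.
    out = StringIO()
    writer = csv.DictWriter(out, fieldnames=["ean", "price", "stock", "title", "supplier"])
    writer.writeheader()
    writer.writerows(best_by_ean.values())
    ftp.storbinary("STOR /aggregated/output.csv", BytesIO(out.getvalue().encode("utf-8")))
```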
Do you have the FTP directories set up for getting and posting the files?
I would recommend building the process using the data storage module, storing the data there and performing the checks that are needed. If that turns out to be cost-inefficient, you can use any other data storage service, such as Gsheets or Airtable, and from there extract your results and upload them to your server.
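To illustrate the check I mean (this is only a sketch, not a platform-specific implementation), the storage essentially acts as a table keyed on EAN, where an incoming row only replaces the stored one when its price is lower:

```python
# Minimal sketch of the keyed-store check, using a plain dict to stand in for
# the data store (or a Gsheets / Airtable table keyed on EAN).
store = {}

def upsert_if_cheaper(store, record):
    """Insert the record, or replace the stored one only when the new price is lower."""
    existing = store.get(record["ean"])
    if existing is None or record["price"] < existing["price"]:
        store[record["ean"]] = record

# Example with made-up values:
upsert_if_cheaper(store, {"ean": "4006381333931", "price": 9.90, "stock": 12,
                          "title": "Sample item", "supplier": "SupplierA"})
upsert_if_cheaper(store, {"ean": "4006381333931", "price": 8.50, "stock": 3,
                          "title": "Sample item", "supplier": "SupplierB"})
# store now holds only SupplierB's row for that EAN.
```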
I am developing this in a live conversation so you can see how the process works, and at the end of the conversation you get your solution. Happy to schedule a 30-minute meeting, with no commitment, to share my thoughts on your exact process. Here is my calendar link: Solve it in 30 mins | Dimitris Goudis | Cal.com