Data Transfer - Best Practices?

Hey everyone,

So I have finally started using Make to automate our video creation process. I have made a terrible mistake and used up all of our data transfer allowance.

So this is the current flow:

We watch for files, download a new file, send it to CloudConvert to convert it to MP3, then get ChatGPT to write the transcript and script for the video. We then add it to our pipeline in Notion.

Problem is, I upload maybe 3-15 pieces of content to Drive daily, and this is going to eat through our data transfer limit extremely fast (we used up the entire 5 GB within an hour just testing, without realising it).

What would be the best practice here? Is there a way to do it without me having to export both the video and an MP3 version of it?

Thank you

Welcome to the Make community!

As you have found out, Make isn’t really ideal for video automation 🙂

The thing with downloading videos is that it involves very large amounts of data transfer, and you can easily blow through your Organisation’s monthly data transfer quota if you’re not careful.

You’ll need to find some way to eliminate the two modules that move the video file itself: the download of the video, and the upload of it to CloudConvert.

Thank you for your reply, Sam!!

Do you have any ideas on how I could achieve this?

Practically, you’ll want to convert the video to MP3 (or export the audio directly) using software on your computer, before uploading to Google Drive.
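
For instance, here’s a minimal sketch of that local conversion step in Python, assuming you have ffmpeg installed on your machine (the file paths and quality setting are just placeholders):

```python
import subprocess
from pathlib import Path

def extract_mp3(video_path: str) -> Path:
    """Strip the audio out of a video as an MP3, using a local ffmpeg install."""
    src = Path(video_path)
    dst = src.with_suffix(".mp3")
    subprocess.run(
        [
            "ffmpeg",
            "-i", str(src),            # input video
            "-vn",                     # drop the video stream entirely
            "-codec:a", "libmp3lame",  # encode the audio as MP3
            "-q:a", "2",               # LAME VBR quality (~190 kbps); adjust to taste
            str(dst),
        ],
        check=True,  # raise if ffmpeg exits with an error
    )
    return dst

# e.g. extract_mp3("episode-01.mp4") writes episode-01.mp3 next to the source
```

The MP3 is typically a tiny fraction of the video’s size, so only uploading that to Drive keeps the data transfer flowing through Make small.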

So instead of watching Drive for new videos, your scenario will be watching for new MP3 files to process.
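
If you want to remove the manual step too, something along these lines could run locally: it polls a folder for new video exports, converts anything it hasn’t seen yet, and drops the MP3 into a Drive-synced folder that your Make scenario watches. (All folder names, extensions, and the polling interval below are assumptions for illustration, not anything Make-specific.)

```python
import subprocess
import time
from pathlib import Path

# Hypothetical locations - adjust to your own setup.
WATCH_DIR = Path("~/Videos/exports").expanduser()          # where your video exports land
OUTPUT_DIR = Path("~/GoogleDrive/mp3-inbox").expanduser()  # Drive-synced folder Make watches

VIDEO_EXTS = {".mp4", ".mov", ".mkv"}

def convert_new_videos() -> None:
    """Convert any video that doesn't yet have a matching MP3 in the output folder."""
    for video in WATCH_DIR.iterdir():
        if video.suffix.lower() not in VIDEO_EXTS:
            continue
        mp3 = OUTPUT_DIR / (video.stem + ".mp3")
        if mp3.exists():
            continue  # already converted on an earlier pass
        subprocess.run(
            ["ffmpeg", "-i", str(video), "-vn",
             "-codec:a", "libmp3lame", "-q:a", "2", str(mp3)],
            check=True,
        )

if __name__ == "__main__":
    OUTPUT_DIR.mkdir(parents=True, exist_ok=True)
    while True:
        convert_new_videos()
        time.sleep(60)  # poll once a minute; tune as needed
```

That keeps the heavy video files out of Make entirely, so the scenario only ever touches the much smaller MP3s.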
