I would like to automate the downloading of a weekly report from my vendor’s website.
The general steps involved are:
- login to the site
- choose the reporting page
- choose the report to run
- update parameters for this week's run
- run the report
- choose to download the report as a CSV file and name the downloaded file
Can anyone suggest the Make module(s) that I should be using to accomplish this task?
Apify Web Scraper + Make, plus somewhere to store the file such as Google Drive, should be able to do this.
The free tier of Apify should be enough for one CSV download per week; how hard it is depends on whether the vendor portal has anti-scraping measures in place.
It is a bit of a technical solution, though.
Alternatively, if updating the parameters also updates the URL, you could probably just pass the CSV download URL directly to Make, along with copied login cookies.
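To illustrate what that approach boils down to (outside of Make), here is a minimal Python sketch. The URL and the cookie name are placeholders, assumptions for the example; substitute the real export URL from your browser and the session cookie copied from your browser's dev tools. Make's HTTP "Get a file" module would effectively send the same request.

```python
import requests

# Hypothetical CSV export URL -- in practice, copy the URL your browser
# ends up at after you update the report parameters.
CSV_URL = "https://vendor.example.com/reports/weekly/export?format=csv"

# Cookie copied from a logged-in browser session (name and value are
# placeholders; find the real ones in dev tools -> Application -> Cookies).
session = requests.Session()
session.cookies.set("session_id", "PASTE_YOUR_SESSION_COOKIE_HERE")

# Build the request without sending it, so we can inspect what would go
# over the wire: the copied login cookie rides along automatically.
prepared = session.prepare_request(requests.Request("GET", CSV_URL))
print(prepared.headers.get("Cookie"))

# To actually download and name the file:
# resp = session.send(prepared)
# resp.raise_for_status()
# with open("weekly_report.csv", "wb") as f:
#     f.write(resp.content)
```

The main caveat is that copied cookies expire, so this only works until the session times out; a scripted login (as described below in this thread) is more durable.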
You can do some of this without Apify (I just re-confirmed this works in Make).
This is an example of logging into rotogrinders.com; the highlighted fields are the important ones.
Get Login Field IDs (Keys)
1. Go to the sign-in page and right-click on the username field
2. Click Inspect in the menu
3. You should go straight to the element; find its ID
4. Copy the field's ID
5. Do the same for the password field
Then enable "share cookies with all HTTP modules".
You should now be logged in for the duration of your run.
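For reference, the steps above amount to submitting the login form yourself and keeping the resulting cookies for later requests. Here is a hedged Python sketch of the same idea; the URL, field keys, and credentials are placeholders (note that a form actually submits fields by their `name` attribute, which is often, but not always, the same as the `id` you find via Inspect).

```python
import requests

# Placeholder sign-in URL and form field keys -- replace with the real
# values found via right-click -> Inspect on the login form.
LOGIN_URL = "https://rotogrinders.com/sign-in"  # example site from this thread
payload = {
    "username": "you@example.com",  # key = the field's name/id from Inspect
    "password": "hunter2",          # placeholder credentials
}

# A Session keeps cookies across requests -- the equivalent of enabling
# "share cookies with all HTTP modules" in Make.
session = requests.Session()

# Build the POST without sending it, just to show the form body that the
# HTTP module submits on your behalf.
prepared = session.prepare_request(
    requests.Request("POST", LOGIN_URL, data=payload)
)
print(prepared.body)  # username=you%40example.com&password=hunter2

# To actually log in and then fetch the report in the same session:
# session.send(prepared)
# csv = session.get("https://rotogrinders.com/.../export.csv")
```

Because the session object carries the login cookies, any request you make with it afterwards is authenticated for the duration of the run, exactly as described above.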
Thanks for this example.
I’ve tried replicating this using a dummy login page.
As you can see, I’m getting a 404 error.