🤖 [Template] Scrape all LinkedIn Profiles / Leads that Comment under a Post to Google Sheets

I have found a nice way to identify prospects on LinkedIn. You can download the template at the bottom of this article.

For those who want more detail: in this guide I will show you how I have set up a ‘Post comments scraper’ that loads everyone who has reacted to a competitor’s LinkedIn post right into your Google Sheets file.

To get started, you’ll need a few things. Of course, you’ll need a Make.com account. We also use Browserflow for the scraping part, so if you don’t already have a Browserflow account, you should create one here.

Step 1: Login to Browserflow and Download the Template

  1. Log in to Browserflow: Go to app.browserflow.io/login and log into your account.
  2. Get API Key: Go to the settings page and copy your API key.
  3. Download the LinkedIn Post Scraping Template:
  • Navigate to the Templates tab.
  • Search for “Get Comments from Linkedin Post” and click Download Template.
  4. Store Cookies (First-Time Setup Only):
  • If this is the first time you are downloading a LinkedIn template, you will be asked to save your login cookies. This is a requirement for this template.
  • If asked, click Get Required Cookies.
  • A login window will appear. Enter your LinkedIn credentials and complete any required security steps (such as two-factor authentication).
  • Once logged in, click Save Login Cookies to start the download of your template.

Step 2: Import Blueprint into Make.com and Set Variables

  1. Log in to Make.com: Go to your Make.com dashboard and open the Scenarios section.
  2. Create a New Scenario:
  • Click Create a new scenario at the top-right corner.
  • In the bottom-center toolbar, click the three dots and select Import Blueprint.
  • Upload the downloaded template file.
  • Your scenario should look like this:

  3. Add Browserflow API Key:
  • Click the first module, named “Start Browserflow Session.”
  • Select Add Connection and paste your Browserflow API key.
  4. Test the Setup:
  • First, go to the post whose comments you would like to scrape. Click the three dots (…) in the top right of the post and select Copy link to post.
  • Go to the Set Variables module and add the Post Url.
  • Now click Run once to test the flow!

Step 3: Monitor the Bot in Browserflow

  1. Track Bot Progress:
  • Go to app.browserflow.io/advanced to view your session.
  • In the Your running bots overview, click View to see your bot’s progress in real-time. (This may take a minute to load.)
  • You can interact with the session as it runs, though the flow should proceed without manual intervention.

Step 4: Manage Results

Once the flow has run, you will receive an output with multiple bundles. Each bundle contains an array named “data” that holds all retrieved results: the name, the LinkedIn URL, the image URL, the person’s tagline (summary), their comment, the reactions and replies, and your relation with the person.
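To make the structure concrete, here is a rough sketch of what one bundle’s “data” array might look like when mapped in Make. Note that the exact field names are assumptions for illustration; check the actual output of your test run for the real keys.

```python
# Hypothetical shape of one bundle's "data" array; the exact field
# names depend on the Browserflow template version you downloaded.
bundle = {
    "data": [
        {
            "name": "Jane Doe",
            "linkedinUrl": "https://www.linkedin.com/in/janedoe/",
            "imageUrl": "https://media.licdn.com/example.jpg",
            "tagline": "Growth Marketer at Example Inc.",
            "comment": "Great post, thanks for sharing!",
            "reactions": 12,
            "replies": 2,
            "relation": "2nd",
        },
    ]
}

# The Make.com Iterator (added in Step 5) does the equivalent of this loop,
# emitting one operation per lead:
for lead in bundle["data"]:
    print(lead["name"], lead["linkedinUrl"])
```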

Step 5: Store Results in Google Sheets or a CRM

Depending on what you are trying to achieve, you might want to store your results somewhere. You can attach this flow to HubSpot, Pipedrive, or any other module that exists in Make, but for this example I’ll show you how to add the results to Google Sheets.

  1. Add Iterator:
  • First, add an Iterator after the last module and iterate over the ‘data’ output variable from the last module.

2. Copy Google Sheets Template:

3. Add Google Sheets Module in Make.com:

  • After the last module, add the Google Sheets “Add Rows” module.
  • Connect your Google account and select your ‘Linkedin Post Leads DEMO’ Spreadsheet.

4. Set Spreadsheet and Sheet ID:

  • If you can’t locate the file, select Enter Manually in the Search Method dropdown.
  • In your Google Sheets document, copy the spreadsheet ID from the URL and paste it into the Spreadsheet ID field.
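If you are unsure which part of the URL is the spreadsheet ID: it is the long token between `/d/` and the next slash. A small sketch of pulling it out programmatically (the example URL and ID are made up):

```python
import re

def spreadsheet_id_from_url(url: str) -> str:
    """Extract the spreadsheet ID from a Google Sheets URL.

    Google Sheets URLs have the form:
    https://docs.google.com/spreadsheets/d/<SPREADSHEET_ID>/edit#gid=0
    """
    match = re.search(r"/spreadsheets/d/([a-zA-Z0-9_-]+)", url)
    if not match:
        raise ValueError(f"Not a Google Sheets URL: {url}")
    return match.group(1)

url = "https://docs.google.com/spreadsheets/d/1AbC2dEfG3hIjK4lMnOp/edit#gid=0"
print(spreadsheet_id_from_url(url))  # → 1AbC2dEfG3hIjK4lMnOp
```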

5. Set Sheet Name & Column variables:

  • Set the Sheet Name to PostLeads and select column range A-Z.
  • Then map the output data fields from the iterator to the right columns as shown in the picture:

Step 6: Run the Workflow

  1. Run the Flow:
  • Whenever you click Run once in your Make Editor, the flow will begin and your Google Sheets file should be filled.

Step 7: Iterate over multiple posts

  1. Add Google Sheets Search Rows module:
  • Right before the Set Custom Variables module, add a Google Sheets Search Rows module.
  • Connect the module to the same Google Sheets file, but now point it to the PostUrls tab.
  • Add the following filters: PostUrl (A) exists AND latest scraped at (B) does not exist.
  • Click OK to close the window.
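In plain code, that filter selects only rows that have a post URL in column A but no timestamp yet in column B. A minimal sketch (the example rows are made up):

```python
# Hypothetical rows from the PostUrls tab: (PostUrl in column A,
# latest scraped at in column B)
rows = [
    ("https://www.linkedin.com/posts/abc", ""),            # not yet scraped
    ("https://www.linkedin.com/posts/def", "2024-05-01"),  # already scraped
    ("", ""),                                              # empty row
]

# Filter: PostUrl (A) exists AND latest scraped at (B) does not exist
to_scrape = [r for r in rows if r[0] and not r[1]]
print(to_scrape)  # only the first row remains
```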

2. Change Set Custom Variables Module

  • Change the value of the Post Url to the Post Url from the Google Sheets module. Your flow will now automatically iterate over all PostUrls that have been entered in the sheet.

3. Add Google Sheets Update Column Module

  • To make sure posts are not scraped twice, we are going to add a timestamp to the posts that have already been scraped. For this, add a Google Sheets Update a Cell module all the way at the end.
  • Connect it to the same Google Sheets file again using the Spreadsheet ID, and choose PostUrls as the Sheet Name.
  • In the Cell field, enter ‘B’ followed by the row number from the first Google Sheets (Search Rows) module.
  • In the Value field, grab the current timestamp using the standard Make.com variables in the tab with the calendar icon.
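Steps 1-3 together implement a simple “mark as done” pattern. Sketched in code, with the Google Sheets and Browserflow calls replaced by placeholders so the control flow is visible:

```python
from datetime import datetime, timezone

# Hypothetical stand-in for the PostUrls tab: url -> latest scraped at
post_urls = {
    "https://www.linkedin.com/posts/abc": "",
    "https://www.linkedin.com/posts/def": "",
}

def scrape_post(url: str) -> None:
    # Placeholder for the Browserflow scraping flow
    print(f"Scraping comments under {url}")

for url, scraped_at in post_urls.items():
    if scraped_at:  # Search Rows filter: skip posts that already carry a timestamp
        continue
    scrape_post(url)
    # Equivalent of the Update a Cell module writing now() into column B,
    # so the next run skips this post
    post_urls[url] = datetime.now(timezone.utc).isoformat()
```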

  4. Add Post Urls in Google Sheets
  • The last thing you have to do is add the URLs of the posts you would like to scrape to the PostUrls tab of the Google Sheets file, in the PostUrl column:

Step 8: Run the Workflow

  1. Run the Flow:
  • Whenever you click Run once in your Make Editor, the flow will begin and your Google Sheets file should be filled.

Pro Tip:

I have published two other guides that can be combined together with this flow to fully automate your outreach:

Happy Automating

By automating this process, you’ll find loads of valuable prospects with this LinkedIn post comments scraper.

Remember to always stay mindful of LinkedIn’s Terms of Service and use this automation responsibly to avoid account bans or restrictions.


Notes:

  • LinkedIn Restrictions: Be cautious, as LinkedIn limits scraping and can block accounts that violate its terms.
  • API Limits: Keep your requests to LinkedIn at reasonable intervals to avoid overloading the servers.
  • For those who want to start right away, download the template here

Another useful addition:

You can use tools like Lusha, Apollo or Hunter.io to convert the resulting Linkedin urls to email addresses and other contact information.

Also, let me know if you would like to receive the template with the Google Sheets modules already included; I can send it to you!


How does the workflow align with LinkedIn’s guidelines?


Thanks for your reply @Josip_Vrbic. The workflow mimics human interactions that you would perform yourself, at a similar pace. I recommend using it in a natural, human-like way, meaning that you don’t run it too frequently. LinkedIn has limits on these types of interactions, so I recommend familiarizing yourself with LinkedIn’s policies and guidelines before using the workflow at a larger scale.