How I qualified 200 leads in 1 minute with ChatGPT and Google Sheets

Edit: For anyone using ChatGPT inside Google Sheets, make sure to save the result and remove the formula. Otherwise the formula will re-evaluate whenever the spreadsheet refreshes and continuously make new API calls!

We need to start with a disclaimer: this is not a fully automated process, and the tools I used are only suggestions (I am not sponsored by any of them). Feel free to replace them as you see fit; this is simply the stack that I used. I can already see improvements to this workflow and more automation opportunities.

Goal: I have roughly 200 event registrations, and I would like to screen all of the participants so I can follow up and invite them to a call.

Problem 1: I don't have LinkedIn Premium or LinkedIn API access (I know I have the badge, but for this experiment I didn't use it).

Problem 2: LinkedIn REALLY REALLY doesn't like to be scraped and uses dynamic element identifiers that change every time you load the page.

Problem 3: Downloading the registration list from the LinkedIn event gives me names and emails, but not LinkedIn profile URLs.

Here is a step by step guide on how to set up your own automation using 3 tools:

the OpenAI API, Google Sheets, and Bardeen (or any other scraper that you like).

What is the workflow?

  1. Get the name and LinkedIn profile link from the event registration page (your scraper needs to be able to click to the next page) and save them in a Google Sheet.
  2. The LinkedIn link will have a bit of extra data at the end; we need to clean that up.
  3. If you're interested in their full experience (as I was) or education, we need to add [/details/experience] to their profile link. Otherwise, leave the link as is.
  4. Run the list of links through a scraper again, this time scraping the entire page. This will include a lot of extra data and is not "usable by a human."
  5. Send the scraped data to the GPT-3.5 API to clean it up into whatever output we desire.
  6. Use GPT-3.5 as many times as you want on either the clean data or the original data to derive your insights.
  7. Get connecting.




Let's now look in detail at how to set it up.

Optional: get the list of participants for our reference, as it also includes their email addresses and could come in handy later. Go to your event and click Download Registration Info:


You can also use this spreadsheet as your working spreadsheet going forward or you can work in a different one. I worked in this same one just to keep everything contained.

Step 1: Click the event participants list just above and go to the full list of attendees.


This is the page where we will run our scraper. Open your scraper of choice and set its output to your Google Sheet of choice. If you're using the Registration Info sheet, make a new tab to keep everything clean.

You only need to scrape the Name and Profile Link (from the clickable name). That's all we need. If you need more help setting up a scraper, please refer to the FAQ of your chosen scraper, or the Bardeen Slack if you're using Bardeen.


Step 2: Cleaning up the links

Now that we have our list of names and LinkedIn profile URLs, we need to clean up the links.


You can see that there is more information in the link after the name: JaneDoe?miniProfile. We don't need the '?' or anything after it. Let's clean that up, and we can do it with a formula:

=LEFT(B1, FIND("?", B1) - 1)        

Paste this into Column C and watch the magic happen:


Now we have a clean LinkedIn link that we can use for data enrichment or import into our CRM later.
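If you prefer scripting the cleanup instead of a formula, the same logic can be sketched in plain JavaScript (cleanProfileUrl is my own name for this helper, not part of any tool mentioned here):

```javascript
// Strip the "?" and everything after it from a scraped LinkedIn profile URL.
// If there is no "?", the URL is returned unchanged.
function cleanProfileUrl(url) {
  var q = url.indexOf("?");
  return q === -1 ? url : url.slice(0, q);
}
```

Note one difference from the =LEFT(B1, FIND("?", B1) - 1) formula: FIND errors out when a link has no "?" at all, so if some of your scraped links are already clean you can wrap the formula as =IFERROR(LEFT(B1, FIND("?", B1) - 1), B1).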

Note:

One thing that LinkedIn does is crop someone's Work Experience or Education after a number of entries. If you're working with senior profiles, you won't be able to see everywhere they worked.

You can see the "Show all 8 experiences ->" button at the bottom of the section.


If we're interested in seeing all Work Experience, we can add [/details/experience] or [/details/education] to the end of the link to get the full list. This will, however, exclude the remaining profile data. Use this as needed.

This is what the Experience page looks like:


For my purposes this is exactly what I'm looking for so I will be scraping this information going forward. You can also scrape the default page but you won't get all the data.

There are 2 ways to add /details/experience to the end of our links:


=C1 & "/details/experience"        

OR

=CONCATENATE(C1, "/details/experience")        

Use whichever one you prefer. I prefer to use Concatenate because it makes me feel like I know what I'm doing.

Now our list looks like this:


We will be using the links in Column D for the scraper; you can always scrape both.
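The same appending step can be sketched as a small helper (detailsUrl is a hypothetical name; unlike the spreadsheet formulas above, it also guards against a trailing slash in the profile link):

```javascript
// Append a LinkedIn "details" section (e.g. "experience" or "education")
// to a clean profile URL, avoiding a double slash.
function detailsUrl(profileUrl, section) {
  var base = profileUrl.endsWith("/") ? profileUrl.slice(0, -1) : profileUrl;
  return base + "/details/" + section;
}
```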

If you don't want to use a scraper, you can always manually copy the entire page and paste it into Google Sheets, but if you're working with a lot of links that's obviously tedious.


Note: There's a lot of data, so your row height will go out of control. You'll want to force the row height back to the default.

Click on Resize rows, then select Specify row height: 21px (the default).






Step 3: Setting up OpenAI

You have your profile data. We're almost there. Now we need to allow Google Sheets to communicate with OpenAI. In order to do that we need 2 things.

  1. An OpenAI API key. You will need to link a credit card to your OpenAI account.
  2. A script inside Google Sheets.

Here is how to set up the script:


Go to Extensions -> Apps Script




Here we want to create a new script. Here is the code that you need to paste in:

const MAX_TOKENS = 256; // caps the length of each response

/**
 * Custom Sheets function: =GPT("prompt", A2, B2)
 * Sends the prompt plus up to two optional cell values to the
 * OpenAI Chat Completions API and returns the model's reply.
 */
function GPT(prompt, value1, value2) {
  var model = "gpt-3.5-turbo";
  var apiKey = "YOUR_API_KEY"; // <-- insert your OpenAI API key here
  var endpoint = "https://api.openai.com/v1/chat/completions";

  // Combine the prompt and the optional cell values into one user message
  var messages = [
    {"role": "user", "content": prompt + " " + (value1 || "") + " " + (value2 || "")}
  ];

  var payload = {
    model: model,
    messages: messages,
    max_tokens: MAX_TOKENS
  };
  var options = {
    method: "post",
    contentType: "application/json",
    payload: JSON.stringify(payload),
    headers: {
      "Authorization": "Bearer " + apiKey
    }
  };
  var response = UrlFetchApp.fetch(endpoint, options);
  var data = JSON.parse(response.getContentText());
  return data.choices[0].message.content.trim();
}

!!!! Don't forget to insert your API key in the apiKey variable !!!!

Click Save, allow any access it requests and click Run just to test that everything is okay.

We want to use GPT-3.5 for data cleanup and other simple tasks because it's about 10 times cheaper than GPT-4 or any other model. We force max_tokens to 256 to keep the responses short (under ~200 words). If you have access to GPT-4, you can change the model name to "gpt-4"; if you also want longer responses, raise the MAX_TOKENS value (don't just delete the const declaration, since the payload still references it).
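To make the model swap explicit, here's a minimal sketch of how the request body changes (buildChatPayload is my own helper for illustration, not part of the script above):

```javascript
// Build a Chat Completions request body for a given model.
// maxTokens is optional; omit it to let the API use its default.
function buildChatPayload(model, prompt, maxTokens) {
  var payload = {
    model: model,
    messages: [{ role: "user", content: prompt }]
  };
  if (maxTokens) {
    payload.max_tokens = maxTokens;
  }
  return payload;
}
```

Switching models is then just a matter of passing "gpt-4" instead of "gpt-3.5-turbo"; the rest of the request stays identical.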

Step 4: Enjoy!

Now that we have the scraped data and a way to communicate to OpenAI, we can do anything we want!

The first thing I usually do is clean up the data so I only get back the information I want. Here's how the script works:

=GPT("prompt", cell1, cell2)

If you don't want to reference any cells, you can leave them out and send only a prompt: =GPT("prompt")

Here is an example prompt to clean up the profile data:

=GPT("Below is a scraped page of a linkedin profile containing experience information. Remove all unnecessary data and return only the work experience, including title and company name: ", A2)        

Here is how it looks in Google Sheets:


Output:


Note: If you want to be able to double-click into a cell and view the contents but not the formula, select the column/cell, copy it, and then press cmd+shift+V (or ctrl+shift+V), or right click -> Paste special -> Paste values only.


Closing thoughts:

You can now use the formula to do anything you want with the scraped data. For example, you can ask it to review a profile and return Yes or No based on its contents: Is this person a manager? Does this person work in a technical role? And so on.
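A screening formula could look like the example below. The exact wording is just a suggestion; asking the model to answer with a single word keeps the column easy to filter and sort.

```
=GPT("Answer with only Yes or No. Based on the profile data below, does this person work in a technical role? ", B2)
```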

Important: you can only send about 10-20 requests at once. If you drag the formula across a lot of cells, you will start getting #Error 429, which means you sent too many requests simultaneously. In that case, delete the error cells and drag the formula again. About 10-20 at a time should be safe.
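If you'd rather script around the rate limit than babysit the formula, the usual pattern is retry-with-backoff. Here's a generic sketch (callWithRetry is my own name; the wait function is injectable so the sketch stays testable, and inside Apps Script you would pass Utilities.sleep):

```javascript
// Retry a function that may throw a rate-limit error (e.g. HTTP 429),
// doubling the wait between attempts until maxAttempts is reached.
function callWithRetry(fn, maxAttempts, waitFn) {
  var delayMs = 1000;
  for (var attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      return fn();
    } catch (err) {
      if (attempt === maxAttempts) throw err; // out of retries, give up
      waitFn(delayMs);
      delayMs *= 2; // exponential backoff
    }
  }
}
```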

Important 2: There is an approximate 3,000-word limit per request. If there is more than that in the profile data, you will get #Error 400. There's not much you can do here except manually open the scraped cell and delete extra information in the hope that it's enough, or use GPT-4, which has a higher limit.
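One way to avoid editing cells by hand is to trim the scraped text before sending it. A rough sketch (truncateWords is my own helper; the ~3,000-word figure is my observation from using the API, not a documented limit):

```javascript
// Crude safeguard against oversized requests: keep only the first
// maxWords whitespace-separated words of the scraped text.
function truncateWords(text, maxWords) {
  var words = text.trim().split(/\s+/);
  if (words.length <= maxWords) return text;
  return words.slice(0, maxWords).join(" ");
}
```

Since the work experience usually sits near the top of the scraped page, cutting off the tail tends to lose only boilerplate.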


Hope this walkthrough helps you as much as it helped me!
