
Finding the Right YouTube Influencers using Scrapingdog’s Google SERP API & YouTube Channel Scraper

TL;DR

  • Use Scrapingdog Google SERP API with query "site:youtube.com/@ finance" to fetch ≈100 channel links.
  • For each, hit /youtube/channel (channel_id) to get subs, views, and videos (JSON).
  • Python demo loops results and saves channels.csv via pandas.
  • Outcome: automated shortlist of finance-niche YouTube influencers.

If you’re a marketer promoting a product, chances are you’ve looked into YouTube influencers.

But finding the right influencer, with a niche audience, good engagement, and enough reach, can be frustrating and slow. You open YouTube. Type a few keywords. Scroll endlessly. Then, open each channel one by one to check its stats.

Thankfully, you don’t have to do this manually anymore.

Let me show you how to automate the entire process using Scrapingdog’s APIs, specifically the Google SERP API and the YouTube Channel Scraper.

What We’re Building

  1. Search for relevant YouTube channels in the finance niche.
  2. Extract their channel URLs using the Google Search API.
  3. Get their subscriber count, video counts, and more using the YouTube Channel API.
  4. Filter and shortlist the best influencers for your campaign.
Data Flow
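The steps above can be sketched as a small pipeline. Every function body here is a placeholder (the real API calls are implemented in the sections below); only the shape of the data flow is the point:

```python
def discover_channels(query):
    """Placeholder for the Google SERP API call; returns channel links."""
    return ["https://www.youtube.com/@channel_a", "https://www.youtube.com/@channel_b"]

def fetch_channel_stats(link):
    """Placeholder for the YouTube Channel API call for one channel."""
    return {"channel": link, "subscribers": 250_000}

def shortlist(rows, min_subs=100_000):
    """Step 4: keep only channels above a subscriber threshold."""
    return [row for row in rows if row["subscribers"] >= min_subs]

links = discover_channels("site:youtube.com/@ finance")
rows = [fetch_channel_stats(link) for link in links]
print(shortlist(rows))
```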

Tools You’ll Need

Libraries

  • Requests – for making HTTP requests to the APIs.
  • Pandas – for saving the data into a CSV file.

Setup

  • Create a folder with any name you like. This is where we will keep our Python files.
  • Create a Python file. I am naming mine channel.py.
  • Install requests and pandas using pip: pip install requests pandas

Discover YouTube Channels

In this step, we’ll use the SERP API and pass the following query.

site:youtube.com/@ finance

Before we begin with the coding, it’s recommended to review the SERP API documentation for better clarity and a smoother implementation.

Once you are done with that, we can now proceed with scraping the search results.

Dashboard

You can simply copy the Python code from the dashboard itself by passing the above query.

				
import requests

api_key = "your-api-key"
url = "https://api.scrapingdog.com/google"

params = {
    "api_key": api_key,
    "query": "site:youtube.com/@ finance",
    "results": 100,
    "country": "us",
    "page": 0,
    "advance_search": "false"
}

response = requests.get(url, params=params)

if response.status_code == 200:
    data = response.json()
    print(data)
else:
    print(f"Request failed with status code: {response.status_code}")

Once you run this code, you will get a JSON result like this.

JSON Response

This response contains 100 YouTube channel links. Now, we have to pass these channel links (channel IDs) to the YouTube Channel Scraper API.
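Each organic result’s link has the form https://www.youtube.com/@handle, and the part after youtube.com/ is what gets passed as channel_id. A minimal helper (the handle in the example is made up):

```python
def link_to_channel_id(link: str) -> str:
    """Strip the YouTube base URL, leaving the @handle used as channel_id."""
    return link.replace("https://www.youtube.com/", "")

print(link_to_channel_id("https://www.youtube.com/@SomeFinanceChannel"))  # @SomeFinanceChannel
```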

Extract YouTube Channel Insights

Now we have to extract the details of each channel by passing its ID to the API. Before we proceed with this section, you should read the API documentation.

				
l = []  # one dict per channel

if response.status_code == 200:
    data = response.json()
    for result in data['organic_results']:
        # The channel handle is the part of the link after "youtube.com/";
        # the channel API accepts it as channel_id.
        channel_id = result['link'].replace("https://www.youtube.com/", "")
        channel_data = requests.get(
            'https://api.scrapingdog.com/youtube/channel',
            params={'api_key': 'your-api-key', 'channel_id': channel_id}
        ).json()

        about = channel_data.get('about', {})
        l.append({
            'subscribers': about.get('subscribers'),
            'views': about.get('views'),
            'videos': about.get('videos'),
        })
else:
    print(f"Request failed with status code: {response.status_code}")

print(l)

Let me explain what we have done here.

  • We use a for loop to iterate over all the channel links we got from the SERP API.
  • For each channel, we collect the number of subscribers, views, and videos.

Once you run this code, you will get this JSON data.

				
[
  {'subscribers': 1400000, 'views': '499,096,888 views', 'videos': '57K videos'},
  {'subscribers': 2740000, 'views': '1,139,947,606 views', 'videos': '81K videos'}
]
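Notice that views and videos come back as display strings ('499,096,888 views', '57K videos') while subscribers is already an integer. Before comparing channels you’ll likely want to normalize these into plain numbers. Here is a small sketch of a parser; the K/M/B suffix handling is an assumption based on the formats shown above:

```python
import re

def parse_metric(value):
    """Convert display strings like '499,096,888 views' or '57K videos'
    into integers; pass integers through and return None for anything else."""
    if isinstance(value, int):
        return value
    if not isinstance(value, str):
        return None
    match = re.search(r"([\d,.]+)\s*([KMB]?)", value)
    if not match:
        return None
    number = float(match.group(1).replace(",", ""))
    # Assumed suffix meanings: K = thousand, M = million, B = billion
    multiplier = {"": 1, "K": 1_000, "M": 1_000_000, "B": 1_000_000_000}[match.group(2)]
    return int(number * multiplier)

print(parse_metric("499,096,888 views"))  # 499096888
print(parse_metric("57K videos"))         # 57000
print(parse_metric(1400000))              # 1400000
```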
				
			

Collecting Data in a CSV File

For this step, we are going to use Pandas.

				
import pandas as pd

df = pd.DataFrame(l)
df.to_csv('channels.csv', index=False, encoding='utf-8')
  • First, we convert the list l (which contains one dictionary per channel) into a Pandas DataFrame.
  • Then we save the DataFrame as a CSV file named channels.csv.

Once you run the code, you will see a file named channels.csv inside your folder.

csv file

Now, you can analyze and reach out to them.
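Step 4 of the plan was filtering and shortlisting. As a sketch, assuming you have normalized subscriber counts to integers, you could shortlist mid-sized channels with pandas. The channel names and threshold values below are arbitrary examples, not real data:

```python
import pandas as pd

# Illustrative data in the same shape as channels.csv
df = pd.DataFrame([
    {"channel": "https://www.youtube.com/@channel_a", "subscribers": 1_400_000},
    {"channel": "https://www.youtube.com/@channel_b", "subscribers": 45_000},
    {"channel": "https://www.youtube.com/@channel_c", "subscribers": 320_000},
])

# Keep channels in a mid-tier band: big enough for reach,
# small enough that sponsorships stay affordable.
shortlist = df[df["subscribers"].between(100_000, 1_000_000)]
print(shortlist["channel"].tolist())  # ['https://www.youtube.com/@channel_c']
```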

Complete Code

You can certainly extract additional details from the JSON response, but for now, the code will look like this:

				
import requests
import pandas as pd

api_key = "your-key"
url = "https://api.scrapingdog.com/google"

params = {
    "api_key": api_key,
    "query": "site:youtube.com/@ finance",
    "results": 2,
    "country": "us",
    "page": 0,
    "advance_search": "false",
    "ai_overview": "false"
}

response = requests.get(url, params=params)

l = []  # one dict per channel
if response.status_code == 200:
    data = response.json()
    for result in data["organic_results"]:
        link = result.get("link")
        # The channel handle is the part of the link after "youtube.com/";
        # the channel API accepts it as channel_id.
        channel_id = link.replace("https://www.youtube.com/", "")
        channel_data = requests.get(
            "https://api.scrapingdog.com/youtube/channel",
            params={"api_key": api_key, "channel_id": channel_id},
        ).json()

        about = channel_data.get("about", {})
        l.append({
            "channel": link,
            "subscribers": about.get("subscribers"),
            "views": about.get("views"),
            "videos": about.get("videos"),
        })
else:
    print(f"Request failed with status code: {response.status_code}")

df = pd.DataFrame(l)
df.to_csv("channels.csv", index=False, encoding="utf-8")
print(l)

Conclusion

By combining Scrapingdog’s Google SERP API and YouTube Channel API, you can automate the entire process of discovering niche influencers and evaluating their reach, without the manual hassle. Whether you’re running a finance campaign or targeting another industry, this workflow helps you find high-potential YouTube creators quickly, with real data to back your decisions. It’s fast, scalable, and ideal for modern influencer marketing.

My name is Manthan Koolwal and I am the founder of scrapingdog.com. I love creating scrapers and seamless data pipelines.
Manthan Koolwal
