Download YouTube comments with a Python script and the YouTube API

Guide to Accessing YouTube Comments: Essential Tools and Python Scripts for Both Coders and Non-Coders

YouTube Comment Extraction Using a Python Script #

If you're comfortable with Python and have a Python interpreter installed, the short script below lets you extract YouTube comments and replies through the YouTube Data API.

Register for API Access #

To start, you'll need to acquire an API Key. Follow these steps:

  1. Register as a developer here: https://developers.google.com/
  2. Create a new project in the Developer's Console: https://console.developers.google.com/cloud-resource-manager
  3. Enable the YouTube Data API v3 for your project in the APIs Library: https://console.developers.google.com/apis/library
  4. Generate a new API Key in the Credentials section of your project (you can verify the key with the short snippet after this list).
  5. For reference, see the commentThreads API documentation: https://developers.google.com/youtube/v3/docs/commentThreads/list
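
Before moving on to the full script, you can sanity-check your key with a plain HTTP request to the commentThreads endpoint. This is a minimal sketch, assuming the requests package is installed; the video ID below is a placeholder for any public video with comments enabled:

import requests

API_KEY = 'ENTER YOUR API KEY HERE'
VIDEO_ID = 'SOME_PUBLIC_VIDEO_ID'  # placeholder: any public video with comments enabled

resp = requests.get(
    'https://www.googleapis.com/youtube/v3/commentThreads',
    params={'part': 'snippet', 'videoId': VIDEO_ID, 'maxResults': 1, 'key': API_KEY},
)
print(resp.status_code)                 # 200 means the key and the API are set up correctly
print(resp.json().get('error', 'OK'))   # quota or auth problems are reported here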

Python script to download YouTube comments. First, install the Google API client library:

pip install google-api-python-client

Then save and run the following script:

from googleapiclient.discovery import build

API_KEY = 'ENTER YOUR API KEY HERE'


def extract_comment_data(snippet):
    # Keep only the author name and the comment text from a comment snippet.
    return {'author': snippet['authorDisplayName'], 'text': snippet['textDisplay']}


def fetch_comments(comments_list, vid_id, next_token=None):
    # Build the YouTube Data API v3 service object.
    youtube_service = build('youtube', 'v3', developerKey=API_KEY)

    request_params = {
        'part': 'snippet,replies',
        'videoId': vid_id,
        'textFormat': 'plainText',
        'maxResults': 100,
    }
    # Include the page token only when fetching a follow-up page of results.
    if next_token:
        request_params['pageToken'] = next_token

    response = youtube_service.commentThreads().list(**request_params).execute()

    for entry in response['items']:
        top_comment = extract_comment_data(entry['snippet']['topLevelComment']['snippet'])
        comments_list.append(top_comment)
        # Replies, if any, are included in the same comment thread item.
        if 'replies' in entry:
            for reply in entry['replies']['comments']:
                comments_list.append(extract_comment_data(reply['snippet']))

    # Recurse while the API reports more pages of comments.
    if 'nextPageToken' in response:
        fetch_comments(comments_list, vid_id, response['nextPageToken'])

    return comments_list


if __name__ == '__main__':
    # Replace 'VIDEO_ID' with the ID of the video whose comments you want to download.
    comments = fetch_comments([], 'VIDEO_ID')
    print(f'Fetched {len(comments)} comments and replies')

Step-by-step explanation of this script:

  1. Import the API client: The script begins by importing the build function from googleapiclient.discovery, which is part of google-api-python-client, Google's client library for interacting with Google APIs.

  2. API_KEY: This variable holds the API key you obtained from the Google API Console. It authenticates your application when calling Google's APIs.

  3. The extract_comment_data function: This function is used to extract the author's name and the content of the comment from a comment snippet. The function takes as input a snippet (a dictionary), which contains information about a comment. It returns a new dictionary with the author's name and the comment text.

  4. The fetch_comments function: This function fetches YouTube comments for a specific video using the YouTube Data API.

    • This function takes three arguments: a list to store the comments, a video ID, and an optional page token for pagination.

    • It first creates a YouTube service object by calling the build function from googleapiclient.discovery with the provided API key.

    • If a page token is provided, it includes it in the request to the API. This is used to get the next page of results when there are more comments than can be returned in a single response (the YouTube API returns at most 100 comment threads per call).

    • The results from the API call are processed in a for loop. For each item in the result, it creates a comment dictionary using the extract_comment_data function and appends it to the comments list. If there are replies to the comment, it processes them in the same way and adds them to the comments list.

    • If there are more pages of comments (indicated by the presence of 'nextPageToken' in the response), it calls itself recursively with the next page token to fetch the remaining comments (an iterative alternative is sketched after this list).

  5. The main block: The script concludes with a Python if __name__ == '__main__': guard, which ensures the code inside it runs only when the script is executed directly, not when it is imported as a module. This is where you put your own main code, typically a call to fetch_comments with a specific video ID and an empty list to store the comments.
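
For example, a minimal main block appended to the script above might call fetch_comments with a placeholder video ID and write the results to a CSV file (the video ID and output filename here are illustrative, not part of the original script):

import csv

if __name__ == '__main__':
    # 'VIDEO_ID' is a placeholder; use the ID from the video URL (the part after v=).
    comments = fetch_comments([], 'VIDEO_ID')

    # Write author/text pairs to a CSV file that opens cleanly in Excel.
    with open('comments.csv', 'w', newline='', encoding='utf-8') as f:
        writer = csv.DictWriter(f, fieldnames=['author', 'text'])
        writer.writeheader()
        writer.writerows(comments)

    print(f'Saved {len(comments)} comments and replies to comments.csv')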

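The recursive call works, but for videos with many pages of comments an iterative loop over nextPageToken avoids deep recursion and is arguably easier to follow. Here is a sketch of that variant, reusing build, API_KEY, and extract_comment_data from the script above:

def fetch_comments_iterative(vid_id):
    youtube_service = build('youtube', 'v3', developerKey=API_KEY)
    comments, next_token = [], None

    while True:
        params = {
            'part': 'snippet,replies',
            'videoId': vid_id,
            'textFormat': 'plainText',
            'maxResults': 100,
        }
        if next_token:
            params['pageToken'] = next_token
        response = youtube_service.commentThreads().list(**params).execute()

        for entry in response['items']:
            comments.append(extract_comment_data(entry['snippet']['topLevelComment']['snippet']))
            for reply in entry.get('replies', {}).get('comments', []):
                comments.append(extract_comment_data(reply['snippet']))

        # Stop when the API no longer reports a next page.
        next_token = response.get('nextPageToken')
        if not next_token:
            return comments
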
No-code tools for scraping YouTube comments #

While writing your own script allows for a flexible data extraction process, a third-party YouTube comment scraper may save you time.

There are various third-party resources available that can help you extract data from YouTube efficiently. Some of these tools are incredibly user-friendly, requiring no programming knowledge, while others may require a basic understanding of programming for setup and usage.

  • YouTube Comment Extractor: A simple-to-use online tool for scraping YouTube comments (including responses) from a particular video and storing them in an Excel spreadsheet. Provide it with the video link, set a few parameters via the web interface, and hit "Start."

  • YouTube Comments Scraper: This web-based data scraping application utilizes Node.js, Express.js, React.js, Puppeteer, and Socket.IO. The app drives a headless Chromium browser (via the Puppeteer library) rather than making requests to the official YouTube API. It should be noted that the app operates on a free Heroku server, which could lead to slower response times.

  • YouTube-Comment-Scraper-CLI: This is a Node.js application offering a command-line interface. It accepts a video ID as a required argument and provides output in CSV or JSON formats or as stdout stream data. It handles replies identically to standard comments.

  • ExportComments: This online tool extracts YouTube, Instagram, and Twitter comments. It's easy to use - you need to provide the YouTube video URL, and it will extract comments into a CSV or Excel file. The free version has a limit on the number of comments you can extract.

  • SerpApi: SerpApi is a paid service that provides a simple way to scrape YouTube comments. It handles proxies, browsers, and CAPTCHAs so that you can get the HTML from any web page with a simple API call.

  • Octoparse: Octoparse is a robust web scraping tool that can scrape data from websites like YouTube. It offers both a free plan and advanced paid options. It has a user-friendly interface that guides you through setting up your scrape.

  • ParseHub: ParseHub is another powerful web scraping tool that can handle data extraction from YouTube. It offers a free version with limited features and paid plans for more advanced needs.

  • ScrapeStorm: ScrapeStorm is an AI-powered visual web scraping tool that can extract data from almost any website without writing code. It is powerful and easy to use. It offers a free version with limited features and paid plans for more advanced needs.

What about downloading YouTube videos?
If you're also interested in learning how to download YouTube videos using Python, check out our related article: Download YouTube Videos with Python.
