Automate Stock Price Alerts to Slack with Make & ScrapingBee: A Web Scraping Guide

As an investor, few things are as important as staying on top of movements in your portfolio. According to a 2022 Gallup poll, 58% of Americans own stock – but how many are truly keeping tabs on their investments day-to-day?

Manually checking stock tickers is tedious and time-consuming. As a web scraping expert, I'm a big proponent of automating data extraction tasks like this. Not only does it save time, but it also ensures you never miss an important update.

In this guide, we'll walk through how to build an automated workflow that scrapes live stock prices from Yahoo Finance and sends formatted alerts to Slack. We'll use the no-code platform Make (formerly Integromat) and the ScrapingBee web scraping API.

Why Scrape Yahoo Finance for Stock Data?

Yahoo Finance is one of the most comprehensive free sources for financial data. You get real-time pricing, historical data, key statistics, news, and analysis for a wide universe of assets including stocks, bonds, commodities, and currencies.

For our use case of tracking stock prices, Yahoo Finance pages have all the key data points we want:

  • Current price
  • $ change
  • % change
  • 52-week high/low
  • Market cap
  • Volume
  • And more

Yahoo Finance stock data

Scraping this data lets us pack far more detail into our Slack alerts. We could monitor things like trading volume spikes or a stock approaching its 52-week high or low.

Legal & Ethical Considerations for Web Scraping

Before we dive into the technical how-to, a quick note on the legal and ethical implications of web scraping.

While scraping publicly available data is generally permissible, there are some important best practices to follow:

  • Respect the website's terms of service and robots.txt
  • Don't overload servers with excessive requests
  • Only scrape what's truly publicly available – don't circumvent logins or firewalls
  • Use the scraped data for legitimate purposes

In Yahoo Finance's case, the terms of service do not explicitly prohibit scraping. Yahoo does enforce rate limits, which we'll respect by keeping our scraping frequency low.
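
If you want to check a site's robots.txt programmatically before scraping, Python's built-in urllib.robotparser does it in a few lines. Here's a minimal sketch – the wildcard user agent and the example quote URL are just placeholders:

from urllib.robotparser import RobotFileParser

# Download and parse the site's robots.txt
robots = RobotFileParser("https://finance.yahoo.com/robots.txt")
robots.read()

# Ask whether a generic crawler ("*") may fetch a given quote page
url = "https://finance.yahoo.com/quote/TSLA"
if robots.can_fetch("*", url):
    print(f"robots.txt allows fetching {url}")
else:
    print(f"robots.txt disallows fetching {url} -- consider an official API instead")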

Setting Up ScrapingBee

With those disclaimers out of the way, let's set up our scraper in ScrapingBee.

First, sign up for a free ScrapingBee account and make note of your API key. We'll input this in Make later to connect the two platforms.

ScrapingBee API key

To construct our scraper, we need to identify the CSS selectors of the data points we want to extract from the Yahoo Finance page. Using Chrome DevTools, we can inspect the page elements:

Inspecting Yahoo Finance elements

For this example, we'll scrape:

Data             CSS Selector
Stock name       #quote-header-info h1
Last price       fin-streamer[data-field="regularMarketPrice"]
Change ($)       fin-streamer[data-field="regularMarketChange"]
Change (%)       fin-streamer[data-field="regularMarketChangePercent"]

These CSS selectors will form our data extraction rules in ScrapingBee. Extraction rules are a key ScrapingBee feature that tells the scraper exactly which elements to target and return from a page.
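
If you'd like to sanity-check these selectors before wiring them into Make, you can call ScrapingBee's HTTP API directly. Here's a minimal Python sketch using the requests library – YOUR_API_KEY is a placeholder, TSLA's quote page is just an example ticker, and the selectors may need updating if Yahoo changes its markup:

import json
import requests

# The same extraction rules we'll use in Make: output field name -> CSS selector
extract_rules = {
    "stock": "#quote-header-info h1",
    "price": "fin-streamer[data-field='regularMarketPrice']",
    "change": "fin-streamer[data-field='regularMarketChange']",
    "changePercent": "fin-streamer[data-field='regularMarketChangePercent']",
}

response = requests.get(
    "https://app.scrapingbee.com/api/v1/",
    params={
        "api_key": "YOUR_API_KEY",                      # your ScrapingBee API key
        "url": "https://finance.yahoo.com/quote/TSLA",  # page to scrape
        "extract_rules": json.dumps(extract_rules),     # selectors as a JSON string
    },
)
print(response.json())  # expect keys: stock, price, change, changePercent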

Automating with Make

Make is a powerful no-code platform for building and automating workflows. We'll use it to schedule our scraper to run on a set interval and pipe the extracted data to Slack.

Connect ScrapingBee & Slack Accounts

In your Make dashboard, create a new scenario and add the following modules:

  1. ScrapingBee > Make an API call
  2. Slack > Send a Channel Message

Make modules

First, connect your ScrapingBee account using your API key. Then connect Slack and authenticate your account.

Configure ScrapingBee API Request

In the ScrapingBee module settings, enter the URL of the stock's Yahoo Finance page (e.g. https://finance.yahoo.com/quote/TSLA) and fill in the extraction rules with our CSS selectors:

{
  "stock": "#quote-header-info h1",
  "price": "fin-streamer[data-field=‘regularMarketPrice‘]", 
  "change": "fin-streamer[data-field=‘regularMarketChange‘]",
  "changePercent": "fin-streamer[data-field=‘regularMarketChangePercent‘]"
}

You can leave the other settings at their defaults. The free ScrapingBee plan includes 1,000 monthly credits, which is plenty for this use case.

Click "Run Once" and you should see the scraped data returned in the output:

ScrapingBee output in Make

Format the Slack Message

Next, set up the Slack module to receive the scraped data and post it to a channel.

In the Message Text field, you can reference the ScrapingBee data using Make's curly brace syntax. For example:

*TSLA Stock Price Alert*
> Price: ${{price}}  
> Change: {{change}} ({{changePercent}})

This will translate to a nicely formatted message in Slack:

Slack stock price alert

You can further customize the message formatting using Slack's Block Kit to add elements like buttons, images, and dividers.
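
For example, here's a rough sketch of a Block Kit payload that mirrors the message above and adds a divider plus a link button. The button URL and action_id are just examples, and you should double-check where Make's Slack module expects block JSON to go:

{
  "blocks": [
    {
      "type": "section",
      "text": {
        "type": "mrkdwn",
        "text": "*TSLA Stock Price Alert*\n> Price: ${{price}}\n> Change: {{change}} ({{changePercent}})"
      }
    },
    { "type": "divider" },
    {
      "type": "actions",
      "elements": [
        {
          "type": "button",
          "action_id": "view_quote",
          "text": { "type": "plain_text", "text": "View on Yahoo Finance" },
          "url": "https://finance.yahoo.com/quote/TSLA"
        }
      ]
    }
  ]
}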

Schedule the Scenario

Lastly, configure the scenario to run on a schedule to get automated price alerts.

Under the scenario's Settings > Scheduling, choose an interval (e.g. every weekday at 9am) and turn Scheduling to ON.

Make scenario scheduling

That's it! The scenario will now run automatically, scraping current stock data and posting an update to Slack at the frequency you set.

Advanced Strategies for Customization and Scaling

This tutorial covers the basics of automated stock price tracking, but there are many ways you could enhance this workflow:

  • Scrape a watchlist of multiple stocks and send batch updates
  • Incorporate additional data like trading volume, market cap, P/E ratio, etc.
  • Set up logic to conditionally alert only when price/percent movement crosses a threshold (see the sketch after this list)
  • Generate visualizations and charts from historical price data
  • Output to other destinations like email digests or SMS
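
To give a flavor of that conditional-alert idea, here's a rough Python sketch of the kind of threshold check you'd recreate with a filter between the ScrapingBee and Slack modules. The 2% threshold and the input format are assumptions; adjust both to taste:

ALERT_THRESHOLD_PCT = 2.0  # only alert on moves larger than +/- 2%

def should_alert(change_percent_text: str) -> bool:
    """Parse a scraped value like '(-2.35%)' or '+1.10%' and compare it to the threshold."""
    cleaned = change_percent_text.strip().strip("()").rstrip("%").replace("+", "")
    try:
        move = abs(float(cleaned))
    except ValueError:
        return False  # don't alert on unparseable data
    return move >= ALERT_THRESHOLD_PCT

print(should_alert("(-2.35%)"))  # True  -> send the Slack message
print(should_alert("+0.40%"))    # False -> skip this run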

As a web scraping professional, I frequently work with clients to build bespoke data extraction pipelines to meet their unique needs. With tools like Make and ScrapingBee, it's easier than ever to automate scraping financial web data for investing research, algorithmic trading, portfolio monitoring, and more.

The key is striking the right balance between utility and ethics. Be selective in what you scrape, respect website terms, and always have a clear use case for the data.

Hopefully this guide gives you a foundation to start automating your own investment tracking. The time and headaches it will save you are well worth the initial setup. Happy scraping!
