Build a Price Tracking Tool with ProxyTee and Python in 2025

Building a price tracking tool is a fantastic way to explore Python automation, especially for beginners. The script you create can be put to practical use immediately and scaled across multiple e-commerce sites. In this post, we’ll also show how ProxyTee can enhance your web scraping capabilities to make the entire process smoother and more reliable.
What is a price tracker?
A price tracker is an automated program that regularly scrapes product prices from e-commerce sites and logs any changes. Beyond basic web scraping, it can include features like sending email alerts when prices fall below a specific threshold. Simple trackers can be used for personal tasks, while more sophisticated ones track prices for millions of products.
Why Monitor Prices?
There are many advantages to price monitoring. For individual users, it’s an opportunity to buy items at the lowest price possible. Businesses, on the other hand, monitor competitor prices to gain insights into sales strategies, enabling them to position their own products at optimal profit margins. Consistent price monitoring allows for making data-backed pricing decisions.
Building the Price Monitoring Script
This section showcases a Python script for price tracking across multiple products. It uses web scraping to extract product data and automated emails to send alerts on price changes. ProxyTee‘s services can be invaluable when implementing a price tracking system of this sort.
Project Requirements
The following script is designed for Python 3.6 and above and utilizes these Python libraries:
- Requests: To send HTTP requests and download web pages without a browser. All page fetching in this script goes through this library.
- BeautifulSoup: To parse HTML, making it easier to query for specific elements.
- lxml: A fast HTML parser. BeautifulSoup provides the query interface, while lxml does the actual parsing behind the scenes.
- price-parser: To extract the price from a string that includes additional text and symbols.
- smtplib: To handle sending email notifications. This module ships with Python’s standard library, so it doesn’t need to be installed.
- Pandas: For filtering product data, reading from CSV files, and exporting.
Here is how to set up a virtual environment to keep your workspace organized:
$ python3 -m venv .venv
$ source .venv/bin/activate
Install all of the necessary dependencies using pip:
$ pip install pandas requests beautifulsoup4 price-parser lxml
Now create a new Python file, and add the imports:
import smtplib
import pandas as pd
import requests
from bs4 import BeautifulSoup
from price_parser import Price
Add these initial configurations:
PRODUCT_URL_CSV = "products.csv"
SAVE_TO_CSV = True
PRICES_CSV = "prices.csv"
SEND_MAIL = True
The CSV file of product URLs is set via PRODUCT_URL_CSV. The SAVE_TO_CSV flag controls whether scraped prices are saved to the file named by PRICES_CSV, and SEND_MAIL can be set to True to send out email alerts.
Reading a List of Product URLs
For this example, we’ll store product URLs in a CSV file, which can be easily managed with a text editor or spreadsheet application. The CSV should have the columns product, url, and alert_price. When a product’s price drops below its alert_price, an email alert is sent.
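For illustration, a minimal products.csv might look like this (the sample rows point at books.toscrape.com, a public scraping sandbox whose pages use the .price_color selector referenced later; the URLs and thresholds are placeholders to replace with your own):

product,url,alert_price
A Light in the Attic,https://books.toscrape.com/catalogue/a-light-in-the-attic_1000/index.html,50.00
Sharp Objects,https://books.toscrape.com/catalogue/sharp-objects_997/index.html,45.00

Data is read via Pandas with the following function: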
def get_urls(csv_file):
    df = pd.read_csv(csv_file)
    return df
This will return a Pandas DataFrame including the product, its URL, and an alert price.
Scraping Prices
Loop through product URLs and use the to_dict() method to get a list of dictionaries for easier processing.
def process_products(df):
    for product in df.to_dict("records"):
        # product["url"] is the URL of the product page
        ...
We’ll create two helper functions here: one to retrieve the HTML for each product URL and another one to extract prices from that HTML.
Fetching HTML:
def get_response(url):
    response = requests.get(url)
    return response.text
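Since the post recommends routing scraping traffic through a proxy service such as ProxyTee, here is a minimal sketch of how that would look with Requests. The gateway address, port, and credentials below are placeholders, not real ProxyTee endpoints; take the actual values from your provider dashboard:

# Hypothetical proxy configuration: replace with your provider's real gateway.
PROXIES = {
    "http": "http://username:password@proxy.example.com:10000",
    "https": "http://username:password@proxy.example.com:10000",
}

def get_response(url):
    # Route the request through the proxy and fail fast on slow or broken pages.
    response = requests.get(url, proxies=PROXIES, timeout=30)
    response.raise_for_status()
    return response.text

With a rotating residential proxy, each request can exit from a different IP, which reduces the chance of rate limiting or blocks when polling many product pages.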
Extracting Prices:
def get_price(html):
    soup = BeautifulSoup(html, "lxml")
    el = soup.select_one(".price_color")
    price = Price.fromstring(el.text)
    return price.amount_float
This function locates the price element with a CSS selector and returns the amount as a float. The .price_color class used here matches books.toscrape.com; adjust the selector for each site you track.
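One caveat: if the selector doesn’t match anything, select_one returns None and el.text raises an AttributeError. A slightly more defensive variant (an optional hardening, not part of the original script) returns None instead of crashing:

def get_price(html):
    soup = BeautifulSoup(html, "lxml")
    el = soup.select_one(".price_color")
    if el is None:
        # Selector not found on this page; signal a missing price.
        return None
    price = Price.fromstring(el.text)
    return price.amount_float

If you adopt this variant, remember to handle None prices before comparing against alert_price.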
Next, loop through all rows, add the scraped price and an alert flag to each product, and append the results to updated_products:
def process_products(df):
    updated_products = []
    for product in df.to_dict("records"):
        html = get_response(product["url"])
        product["price"] = get_price(html)
        product["alert"] = product["price"] < product["alert_price"]
        updated_products.append(product)
    return pd.DataFrame(updated_products)
The above returns an updated DataFrame object, with the extracted prices and alert flag.
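To sanity-check the pipeline interactively before wiring up alerts, you could run the two functions by hand (this assumes the products.csv sketched earlier, with its product, url, and alert_price columns):

df = get_urls(PRODUCT_URL_CSV)
df_updated = process_products(df)
print(df_updated[["product", "price", "alert"]])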
Saving the Output:
The final DataFrame can be saved as a CSV file:
if SAVE_TO_CSV:
    df_updated.to_csv(PRICES_CSV, index=False, mode="a")
By setting the mode to “a”, new data is appended to any previous results in the file, so every scheduled run adds fresh rows.
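One consequence of appending is that pandas writes the header row on every call to to_csv. If you want a single header per file, a common pattern (an optional tweak, not part of the original script) is to write the header only when the file doesn’t exist yet:

import os

if SAVE_TO_CSV:
    df_updated.to_csv(
        PRICES_CSV,
        index=False,
        mode="a",
        header=not os.path.exists(PRICES_CSV),  # header only on first write
    )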
Sending Email Alerts
If an email alert has been triggered, use this function to filter the updated data down to the alerted rows and build the email subject and body.
def get_mail(df):
    subject = "Price Drop Alert"
    body = df[df["alert"]].to_string()
    subject_and_message = f"Subject:{subject}\n\n{body}"
    return subject_and_message
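Building the message as a raw string works for this simple case, but it is fragile with non-ASCII product names. An optional, more robust alternative is the standard library’s email.message.EmailMessage, which handles headers and encoding for you (the sender and recipient parameters here are illustrative):

from email.message import EmailMessage

def get_mail_message(df, sender, recipient):
    msg = EmailMessage()
    msg["Subject"] = "Price Drop Alert"
    msg["From"] = sender
    msg["To"] = recipient
    # Include only the rows where the alert flag is set.
    msg.set_content(df[df["alert"]].to_string())
    return msg

Such a message would be sent with smtp.send_message(msg) instead of smtp.sendmail.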
Next is the function using `smtplib` to send emails.
def send_mail(df):
    message_text = get_mail(df)
    with smtplib.SMTP("smtp.server.address", 587) as smtp:
        smtp.starttls()
        smtp.login(mail_user, mail_pass)
        smtp.sendmail(mail_user, mail_to, message_text)
This assumes the `mail_user`, `mail_pass` and `mail_to` variables have been defined previously.
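For reference, here is a minimal sketch of those settings; the values are placeholders to replace with your own SMTP account details (for Gmail and similar providers this is typically an app-specific password):

mail_user = "alerts@example.com"   # SMTP login / sender address (placeholder)
mail_pass = "app-password-here"    # SMTP password or app token (placeholder)
mail_to = "you@example.com"        # recipient of the price alerts (placeholder)

In a production setup, prefer reading these from environment variables rather than hard-coding them in the script.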
Here’s how to bring everything together:
def main():
    df = get_urls(PRODUCT_URL_CSV)
    df_updated = process_products(df)
    if SAVE_TO_CSV:
        df_updated.to_csv(PRICES_CSV, index=False, mode="a")
    if SEND_MAIL:
        send_mail(df_updated)

if __name__ == "__main__":
    main()
Run the script on a schedule using cron (Linux/macOS), Task Scheduler (Windows), or a cloud scheduler.
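As a concrete example, a crontab entry like the following would run the tracker every morning at 9:00 (the paths are placeholders for wherever your virtual environment and script live):

# m h dom mon dow  command
0 9 * * * /path/to/.venv/bin/python /path/to/price_tracker.py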
Putting It All Together
This Python price tracking script fetches product prices from any number of URLs and offers flexible options for delivering output data and notifications; for instance, you could easily export the data to a database instead of a CSV file. As price tracking often requires residential proxies for success, ProxyTee’s rotating residential proxies, with unlimited bandwidth and geo-targeting features, are an ideal fit. For further details about features, such as auto-rotation or the simple API, visit their official website.
When engaging in web scraping, it’s crucial to use a proxy service such as ProxyTee to avoid IP blocks and bans. ProxyTee provides an affordable, reliable, and easy-to-use solution for anyone needing rotating residential proxies. Its features—such as unlimited bandwidth, a global IP pool, protocol flexibility, auto-rotation, and API integration—make it a great option for businesses and individuals involved in tasks like web scraping, streaming, or data gathering. With a focus on user-friendly design and competitive pricing, ProxyTee delivers strong value for those looking for effective proxy services. ProxyTee offers various proxy products, including Static Residential Proxies, Mobile Proxies, and Datacenter Proxies.