
Price Monitoring API — Track Competitor Prices Automatically

Use SimpleCrawl's API to build automated price monitoring systems. Extract product prices, detect changes, and get alerts — all through a simple API.

SimpleCrawl Team · 6 min read

A price monitoring API lets you track competitor prices, detect changes, and make data-driven pricing decisions without manually checking hundreds of product pages. SimpleCrawl's structured extraction pulls product names, prices, availability, and more from any e-commerce site — no custom selectors required.

Why Price Monitoring Matters

Manual price checking does not scale. A single product with 10 competitors across 5 marketplaces means monitoring 50 pages. At 100 products, that is 5,000 page checks. Daily.

Businesses that automate price monitoring:

  • React to competitor price changes within hours, not days
  • Identify pricing trends before they affect revenue
  • Maintain competitive positioning across all channels
  • Track MAP (Minimum Advertised Price) violations

How SimpleCrawl Handles Price Extraction

SimpleCrawl's schema-based extraction returns structured pricing data without writing CSS selectors:

curl -X POST https://api.simplecrawl.com/scrape \
  -H "Authorization: Bearer YOUR_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "url": "https://store.example.com/product/widget-pro",
    "output": "json",
    "schema": {
      "product_name": "string",
      "price": "number",
      "currency": "string",
      "original_price": "number",
      "discount_percentage": "number",
      "in_stock": "boolean",
      "seller": "string",
      "last_updated": "string"
    }
  }'

Response:

{
  "data": {
    "product_name": "Widget Pro 2026",
    "price": 49.99,
    "currency": "USD",
    "original_price": 79.99,
    "discount_percentage": 37,
    "in_stock": true,
    "seller": "Example Store",
    "last_updated": "2026-03-01"
  },
  "url": "https://store.example.com/product/widget-pro",
  "scraped_at": "2026-03-01T14:30:00Z"
}

No selectors to maintain. When the site redesigns, the extraction still works because SimpleCrawl understands the content semantically.
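Before a result enters your pipeline, it is still worth confirming that each schema field came back with the expected JSON type. A minimal sketch — the `check_types` helper below is ours for illustration, not part of any SimpleCrawl SDK:

```python
# Map schema type names to the Python types a JSON response produces.
TYPE_MAP = {"string": str, "number": (int, float), "boolean": bool}

def check_types(data: dict, schema: dict) -> list[str]:
    """Return a list of human-readable problems; an empty list means valid."""
    problems = []
    for field, kind in schema.items():
        if field not in data or data[field] is None:
            problems.append(f"missing field: {field}")
        elif not isinstance(data[field], TYPE_MAP[kind]):
            problems.append(
                f"{field}: expected {kind}, got {type(data[field]).__name__}"
            )
    return problems

schema = {"product_name": "string", "price": "number", "in_stock": "boolean"}
data = {"product_name": "Widget Pro 2026", "price": 49.99, "in_stock": True}
print(check_types(data, schema))  # → []
```

A failed check is a signal to retry the scrape or quarantine the row rather than write a bad price into your history.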

Building a Price Monitoring System

Architecture

┌──────────────┐    ┌──────────────┐    ┌──────────────┐
│  Scheduler   │───▶│  SimpleCrawl │───▶│  Database    │
│  (cron job)  │    │  Batch API   │    │  (Postgres)  │
└──────────────┘    └──────────────┘    └──────┬───────┘
                                               │
                                        ┌──────▼─────┐
                                        │  Alerting  │
                                        │  (email/   │
                                        │   Slack)   │
                                        └────────────┘

Step 1: Define Products to Monitor

products = [
    {
        "name": "Widget Pro",
        "urls": [
            "https://store-a.com/widget-pro",
            "https://store-b.com/products/widget-pro",
            "https://amazon.com/dp/B0EXAMPLE",
        ]
    },
    {
        "name": "Widget Lite",
        "urls": [
            "https://store-a.com/widget-lite",
            "https://store-b.com/products/widget-lite",
        ]
    },
]

Step 2: Extract Prices

import simplecrawl
from datetime import datetime, timezone

client = simplecrawl.Client(api_key="YOUR_KEY")

price_schema = {
    "product_name": "string",
    "price": "number",
    "currency": "string",
    "in_stock": "boolean",
}

def extract_prices(product):
    results = []
    for url in product["urls"]:
        try:
            result = client.scrape(url, output="json", schema=price_schema)
            results.append({
                "product": product["name"],
                "url": url,
                "price": result.data["price"],
                "currency": result.data.get("currency", "USD"),
                "in_stock": result.data.get("in_stock", True),
                "scraped_at": datetime.now(timezone.utc).isoformat(),
            })
        except Exception as e:
            results.append({
                "product": product["name"],
                "url": url,
                "error": str(e),
                "scraped_at": datetime.now(timezone.utc).isoformat(),
            })
    return results

Step 3: Detect Changes and Alert

import smtplib
from email.message import EmailMessage

def check_price_changes(current_prices, previous_prices, threshold=0.05):
    alerts = []
    for curr in current_prices:
        if "error" in curr:
            continue
        prev = next(
            (p for p in previous_prices if p["url"] == curr["url"]),
            None
        )
        if not prev or "error" in prev:
            continue

        change = (curr["price"] - prev["price"]) / prev["price"]
        if abs(change) >= threshold:
            alerts.append({
                "product": curr["product"],
                "url": curr["url"],
                "old_price": prev["price"],
                "new_price": curr["price"],
                "change_pct": round(change * 100, 1),
            })
    return alerts

def send_alert(alerts):
    if not alerts:
        return
    body = "Price changes detected:\n\n"
    for a in alerts:
        direction = "increased" if a["change_pct"] > 0 else "decreased"
        body += (
            f"• {a['product']} — {direction} {abs(a['change_pct'])}% "
            f"(${a['old_price']} → ${a['new_price']})\n"
            f"  {a['url']}\n\n"
        )
    msg = EmailMessage()
    msg.set_content(body)
    msg["Subject"] = f"Price Alert: {len(alerts)} change(s) detected"
    msg["From"] = "alerts@yourdomain.com"
    msg["To"] = "team@yourdomain.com"
    # Send via your SMTP provider — substitute your own host and credentials:
    with smtplib.SMTP("smtp.yourdomain.com", 587) as server:
        server.starttls()
        server.login("alerts@yourdomain.com", "YOUR_SMTP_PASSWORD")
        server.send_message(msg)
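`check_price_changes` needs the previous run's results, so each run's snapshot has to persist somewhere between invocations. A minimal sketch using a JSON file — the `prices.json` path is an arbitrary choice; swap in Postgres once volume justifies it:

```python
import json
from pathlib import Path

SNAPSHOT = Path("prices.json")  # arbitrary location; adjust for your setup

def load_previous() -> list[dict]:
    """Return the last run's price list, or [] on the first run."""
    if SNAPSHOT.exists():
        return json.loads(SNAPSHOT.read_text())
    return []

def save_snapshot(prices: list[dict]) -> None:
    """Overwrite the snapshot with the current run's results."""
    SNAPSHOT.write_text(json.dumps(prices, indent=2))

# A typical run: compare against the last snapshot, then replace it.
# previous = load_previous()
# current = [row for p in products for row in extract_prices(p)]
# send_alert(check_price_changes(current, previous))
# save_snapshot(current)
```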

Step 4: Schedule with Cron

# Check prices every 6 hours
0 */6 * * * cd /path/to/project && python monitor_prices.py

Or use SimpleCrawl's webhook for async batch processing:

client.batch(
    urls=[url for p in products for url in p["urls"]],
    output="json",
    schema=price_schema,
    webhook_url="https://your-api.com/prices/callback"
)

Handling E-Commerce Challenges

Dynamic Pricing

Many e-commerce sites change prices based on time, location, and browsing history. SimpleCrawl's managed proxy network keeps the prices you observe consistent by:

  • Rotating IP addresses to avoid personalized pricing
  • Using neutral browser fingerprints
  • Not carrying cookies between requests

JavaScript-Rendered Prices

Sites like Amazon and Walmart load prices dynamically via JavaScript. SimpleCrawl's built-in browser rendering captures the final rendered price — no need to configure wait conditions or interaction scripts.

Anti-Bot Protection

E-commerce sites invest heavily in bot detection. SimpleCrawl's advanced anti-bot bypass handles Cloudflare, DataDome, and PerimeterX — the protections used by most major retailers. See our comparison guide for bypass success rates.

Variant Pricing

Products with multiple variants (sizes, colors) often show different prices. Use structured extraction with array schemas:

result = client.scrape(product_url, output="json", schema={
    "product_name": "string",
    "variants": [{
        "name": "string",
        "price": "number",
        "in_stock": "boolean"
    }]
})

Cost Analysis

Monitoring scope            Check frequency   Pages/month   SimpleCrawl plan   Monthly cost
50 products, 3 stores       Every 6 hours     18,000        Growth ($79)       $79
200 products, 5 stores      Daily             30,000        Scale ($199)       $199
500 products, 5 stores      Every 6 hours     300,000       Enterprise         Custom
1,000 products, 3 stores    Daily             90,000        Scale ($199)       $199
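The pages/month column follows from products × stores × checks per day × a 30-day month; a quick sanity check:

```python
def pages_per_month(products: int, stores: int, checks_per_day: int) -> int:
    """Pages scraped per month at a given check frequency (30-day month)."""
    return products * stores * checks_per_day * 30

print(pages_per_month(50, 3, 4))   # 50 products, 3 stores, every 6 hours → 18000
print(pages_per_month(200, 5, 1))  # 200 products, 5 stores, daily → 30000
```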

Compare this to manual monitoring (analyst time), building your own scraping infrastructure (engineering time + proxy costs), or enterprise price intelligence platforms ($500–$5,000+/month).

FAQ

What is a price monitoring API?

A price monitoring API automates the process of checking product prices across websites. Instead of manually visiting competitor pages, you send URLs to the API and receive structured pricing data back. Combined with scheduling and alerting, it creates an automated competitive intelligence system.

Is price scraping legal?

Scraping publicly available product prices is generally legal. Prices displayed on public e-commerce pages are intended for consumers to see. However, avoid scraping behind login walls, violating Terms of Service where enforceable, or collecting personal data. Always check robots.txt and consult legal counsel for your jurisdiction.

How often should I check prices?

It depends on your market. Fast-moving categories (electronics, travel) benefit from hourly or every-6-hour checks. Stable categories (furniture, industrial) can use daily or weekly monitoring. Start with daily and adjust based on how frequently prices actually change.
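To make that adjustment empirically, measure how often prices actually move in your stored history. A minimal sketch:

```python
def change_rate(history: list[float]) -> float:
    """Fraction of consecutive checks where the price moved."""
    if len(history) < 2:
        return 0.0
    changes = sum(1 for a, b in zip(history, history[1:]) if a != b)
    return changes / (len(history) - 1)

# Seven daily checks with two price moves → rate of about 0.33,
# so daily checks are capturing plenty of movement here.
print(change_rate([49.99, 49.99, 47.99, 47.99, 47.99, 49.99, 49.99]))
```

A rate near zero suggests you can check less often; a rate near one suggests you are missing intermediate changes and should check more frequently.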

Can SimpleCrawl monitor prices on Amazon?

Yes. SimpleCrawl's JavaScript rendering and anti-bot bypass handle Amazon product pages. Use structured extraction to pull prices, availability, seller information, and ratings without writing Amazon-specific parsing code.

How is this different from enterprise price intelligence tools?

Enterprise tools (Prisync, Competera, Intelligence Node) offer dashboards, analytics, and repricing recommendations. They are full platforms. SimpleCrawl is the data extraction layer — you get raw pricing data and build your own analytics. SimpleCrawl costs $29–199/month vs $500–5,000/month for enterprise tools.

Can I combine price monitoring with other use cases?

Yes. Teams commonly combine price monitoring with lead generation (scraping product catalogs for business development) and content aggregation (monitoring competitor blog content alongside pricing).

Get Started

Build your price monitoring system in an afternoon. Join the SimpleCrawl waitlist for 500 free credits — enough to set up monitoring for 100+ products across multiple stores.

Ready to try SimpleCrawl?

We're building the simplest web scraping API for AI. Join the waitlist and get 500 free credits at launch.

Get early access + 500 free credits