
name: wget-reader
description: Fetch data from URLs. Use when asked to download content, fetch remote files, or read web data.
version: 1.0.0

Wget URL Reader

Overview

Fetches content from URLs using the wget command-line tool. Supports downloading files, reading web pages, and retrieving API responses.

Instructions

  1. When the user provides a URL to read or fetch:

    • Validate the URL format
    • Use wget with appropriate flags based on content type
  2. For reading content to stdout (display):

    wget -qO- "<URL>"
    
  3. For downloading files:

    wget -O "<filename>" "<URL>"
    
  4. For JSON API responses:

    wget -qO- --header="Accept: application/json" "<URL>"
    
  5. Common wget flags:

    • -q: Quiet mode (no progress output)
    • -O-: Output to stdout
    • -O <file>: Output to specific file
    • --header: Add custom HTTP header
    • --timeout=<seconds>: Set timeout
    • --tries=<n>: Number of retries
    • --user-agent=<agent>: Set user agent
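
The steps above can be combined into a small wrapper. A minimal sketch, where the function name fetch_url and the 10-second timeout are illustrative choices rather than part of the skill:

```shell
#!/bin/sh
# fetch_url: validate the URL format, then fetch its content to stdout
# using the flags listed above.
fetch_url() {
    url="$1"
    # Step 1: basic format check -- require an http:// or https:// scheme.
    case "$url" in
        http://*|https://*) ;;
        *) echo "error: not a valid http(s) URL: $url" >&2; return 1 ;;
    esac
    # Step 2: quiet output to stdout, with a timeout and bounded retries.
    wget -qO- --timeout=10 --tries=3 "$url"
}
```

Usage: `fetch_url "https://example.com"` prints the page body, while `fetch_url "not a url"` fails with an error message and a non-zero exit status.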

Examples

Example: Read webpage content

Input: "Read the content from https://example.com"
Command:

wget -qO- "https://example.com"

Example: Download a file

Input: "Download the file from https://example.com/data.json"
Command:

wget -O "data.json" "https://example.com/data.json"

Example: Fetch API with headers

Input: "Fetch JSON from https://api.example.com/data"
Command:

wget -qO- --header="Accept: application/json" "https://api.example.com/data"

Example: Download with timeout and retries

Input: "Download with 30 second timeout"
Command:

wget --timeout=30 --tries=3 -O "output.txt" "<URL>"

Guidelines

  • Always quote URLs to handle special characters
  • Use -q flag to suppress progress bars in scripts
  • For large files, consider adding --show-progress for user feedback
  • Respect robots.txt and rate limits when fetching multiple URLs
  • Use --no-check-certificate only when necessary (self-signed certs)
  • For authentication, use --user and --password or --header="Authorization: Bearer <token>"
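
For the authentication guideline, a small helper that assembles (but does not run) the wget invocation makes the header layout concrete. A sketch, where build_auth_cmd, the token, and the endpoint are all illustrative placeholders:

```shell
#!/bin/sh
# build_auth_cmd: print the wget command for an authenticated JSON fetch.
# The token and endpoint values below are placeholders, not real credentials.
build_auth_cmd() {
    token="$1"
    url="$2"
    printf 'wget -qO- --header="Authorization: Bearer %s" --header="Accept: application/json" "%s"\n' \
        "$token" "$url"
}

build_auth_cmd "example-token" "https://api.example.com/data"
```

Printing the command first lets you inspect the quoting and headers before executing it, which is useful when the token comes from an environment variable.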
