Practical case: MJPEG on Raspberry Pi Zero W CSI camera

Objective and use case

What you’ll build: A minimal-latency MJPEG-over-HTTP web server that streams real-time video from the CSI-connected Raspberry Pi Camera Module v2, hosted on a Raspberry Pi Zero W.

Why it matters / Use cases

  • Real-time surveillance: Stream live video feeds for security applications using Raspberry Pi Zero W and Camera Module v2.
  • Remote monitoring: Enable remote access to camera feeds for environmental monitoring or wildlife observation.
  • Interactive projects: Use the MJPEG stream in DIY robotics or IoT applications for real-time video feedback.
  • Low-latency video conferencing: Implement a lightweight video server for personal or small group video calls.

Expected outcome

  • End-to-end stream latency of roughly 200–300 ms on the local 2.4 GHz Wi‑Fi network.
  • A steady 10–15 FPS at 640×480, the practical ceiling for software JPEG encoding on the Zero W's single ARMv6 core.
  • Web server response time under about 100 ms for the initial MJPEG request.
  • Support for a small number of concurrent clients (two to three) without severe degradation in performance.

Audience: Advanced users; Level: Advanced

Architecture/flow: Raspberry Pi Zero W with Camera Module v2 captures video, processes it with Python, and streams via HTTP using MJPEG format.

Advanced Practical Case: MJPEG Web Server Streaming from CSI on Raspberry Pi Zero W + Camera Module v2

Objective: Build a minimal-latency, MJPEG-over-HTTP web server that streams real-time video from the CSI-connected Raspberry Pi Camera Module v2, hosted directly on a Raspberry Pi Zero W. The server will expose a web page and a raw MJPEG endpoint suitable for browsers and clients like VLC or ffmpeg.

Note on OS/arch: Raspberry Pi Zero W uses an ARMv6 CPU and does not support a 64-bit OS. We therefore use Raspberry Pi OS Bookworm (32-bit, “Lite” recommended) with Python 3.11, which matches the default Python version on Bookworm. All commands below explicitly target Bookworm and Python 3.11.


Prerequisites

  • Experience level: Advanced (comfortable with Linux shell, systemd, Python, and networking).
  • A fresh Raspberry Pi OS Bookworm Lite (32-bit) image flashed to a microSD.
  • Reliable 5V power for the Pi Zero W.
  • Basic familiarity with libcamera/Picamera2 stack introduced in Bullseye/Bookworm.
  • Network access to your Raspberry Pi Zero W (Wi‑Fi 2.4 GHz).
  • The correct camera ribbon for Pi Zero (22-pin on the board; the Camera Module v2 uses 15-pin; you need a 15-to-22 Pi Zero camera cable).

Why Bookworm 32-bit? The Pi Zero W is not 64-bit capable; Bookworm 32-bit still ships Python 3.11 and the modern libcamera/Picamera2 stack required for CSI camera use.


Materials

  • Single-board computer: Raspberry Pi Zero W (v1.1) – built-in 2.4 GHz Wi‑Fi, 512 MB RAM.
  • Camera: Raspberry Pi Camera Module v2 (Sony IMX219) – 8 MP CSI camera.
  • Camera cable: Raspberry Pi Zero camera cable (15-pin ↔ 22-pin) – required to connect the Camera Module v2 to the Zero W.
  • microSD: 16 GB or larger, Class 10 – flashed with Raspberry Pi OS Bookworm Lite (32-bit).
  • Power: 5 V, 2.5 A micro‑USB PSU – stable power is critical during camera use.
  • Optional: heatsink for the Zero W – helps under continuous video capture.
  • Optional: USB‑TTL console adapter – for headless recovery/troubleshooting.

Setup/Connection

1) Flash OS and boot headless (recommended)

  • Use Raspberry Pi Imager:
  • OS: Raspberry Pi OS Lite (32-bit) – Bookworm.
  • Set hostname: rpi-zero.
  • Enable SSH and set username/password (or SSH key).
  • Set Wi‑Fi country, SSID, and passphrase.
  • Insert the microSD into the Pi Zero W.

2) Mount the Camera Module v2 to the Zero W

  • Power off the Pi.
  • Use the Pi Zero camera cable (22-pin side to the Pi Zero W CSI connector; 15-pin side to the Camera Module v2).
  • Orientation:
  • On the Pi Zero W: the blue stiffener faces away from the PCB (toward the board edge), with contacts facing the PCB.
  • On the Camera Module v2: the blue stiffener typically faces the camera PCB; verify contacts align with the connector contacts.
  • Ensure the cable is fully seated and the latch is closed on both sides.

3) First boot and basic checks

  • Power on and SSH in:
  • Find the Pi: ping rpi-zero.local or arp -a or your router’s DHCP list.
  • SSH: ssh pi@rpi-zero.local.
  • Confirm Python and OS:
  • python3 --version should show Python 3.11.x on Bookworm.
  • cat /etc/os-release should show Debian 12 (Bookworm).

Enabling Required Interfaces

With Bookworm, the libcamera stack is the default and does not require the legacy camera stack. Ensure the system has sufficient GPU memory and camera overlays active.

Option A: Using raspi-config

  • Run:
    sudo raspi-config
  • If your raspi-config still offers a “Legacy Camera” option, do NOT enable it; we stay with the default libcamera stack (on Bookworm the legacy stack is deprecated and the option is typically no longer offered).
  • Increase GPU memory:
  • Performance Options -> GPU Memory -> 128
  • Finish and reboot:
    sudo reboot

Option B: Editing /boot/firmware/config.txt directly

  • Edit config:
    sudo nano /boot/firmware/config.txt
  • Add or adjust the following line at the end (on Bookworm, camera_auto_detect=1 is the default and normally detects the IMX219 sensor by itself):
    gpu_mem=128
  • Only if auto-detection fails, disable it and load the overlay explicitly:
    camera_auto_detect=0
    dtoverlay=imx219
  • Save and reboot:
    sudo reboot

System Update and Package Installation

Update and install the camera tools and Python packages. On newer Bookworm images the camera apps package and commands are named rpicam-apps / rpicam-*, with the libcamera-* names kept as transitional aliases, so the commands below still work.

sudo apt update
sudo apt full-upgrade -y
sudo apt install -y \
  libcamera-apps \
  python3-picamera2 \
  python3-libcamera \
  python3-venv \
  python3-pip \
  python3-gpiozero \
  python3-smbus \
  python3-spidev \
  git

Check camera access (the camera runs for a few seconds; on a headless Lite install there is no preview window, so a clean, error-free exit is the health check):

libcamera-hello -t 3000

Take a quick still to verify the camera:

libcamera-jpeg -o test.jpg
ls -lh test.jpg

Python Virtual Environment

We will create a venv that can still access system-installed Picamera2 (via apt) by enabling system site packages.

# Project directory
mkdir -p ~/mjpeg-streamer
cd ~/mjpeg-streamer

# Create venv with system site packages so python3-picamera2 is available
python3 -m venv --system-site-packages .venv
source .venv/bin/activate

# Verify Python is 3.11
python -V

# Install web framework and utilities via pip
pip install --upgrade pip wheel
pip install aiohttp pillow

Confirm Picamera2 import and versions:

python - << 'PY'
import sys, pkgutil
print("Python:", sys.version)
print("Has picamera2:", pkgutil.find_loader("picamera2") is not None)
try:
    from picamera2 import Picamera2
    print("Picamera2 import OK")
except Exception as e:
    print("Picamera2 import failed:", e)
PY

Full Code

Create the MJPEG web server using Picamera2 for capture and aiohttp for HTTP streaming. This implementation:
– Configures the CSI camera for a low-latency 640×480 stream at 10–15 FPS (tunable).
– Encodes frames to JPEG using Pillow.
– Serves an HTML page and a raw MJPEG endpoint.
– Avoids heavy dependencies like OpenCV to keep the footprint appropriate for the Pi Zero W.
– Minimizes per-client overhead by streaming the latest frame.

Create app.py in ~/mjpeg-streamer:

#!/usr/bin/env python3
import argparse
import asyncio
import io
import signal
import sys
import threading
import time

from aiohttp import web
from PIL import Image

from picamera2 import Picamera2


BOUNDARY = b"--FRAME"
CRLF = b"\r\n"

class FrameProducer(threading.Thread):
    def __init__(self, width, height, fps, awb="auto", ae="auto"):
        super().__init__(daemon=True)
        self.width = width
        self.height = height
        self.target_period = 1.0 / float(fps)
        self._stop = threading.Event()
        self.last_jpeg = None
        self.last_timestamp = 0.0
        self.lock = threading.Lock()

        self.picam2 = Picamera2()
        camera_config = self.picam2.create_video_configuration(
            main={"size": (self.width, self.height), "format": "RGB888"},
            controls={"FrameRate": int(1.0 / self.target_period)},
        )
        self.picam2.configure(camera_config)

        # If desired, you can tweak AE/AWB here; defaults are fine for most use.
        controls = {}
        if awb != "auto":
            controls["AwbMode"] = awb
        if ae != "auto":
            controls["AeEnable"] = (ae.lower() != "off")
        if controls:
            self.picam2.set_controls(controls)

    def stop(self):
        self._stop.set()

    def run(self):
        self.picam2.start()
        try:
            next_time = time.time()
            while not self._stop.is_set():
                t0 = time.time()
                # Capture a frame as a numpy array (H, W, 3) RGB888
                frame = self.picam2.capture_array("main")
                # Encode to JPEG using Pillow
                jpeg = self._encode_jpeg(frame)
                with self.lock:
                    self.last_jpeg = jpeg
                    self.last_timestamp = t0

                # Try to maintain target FPS
                next_time += self.target_period
                delay = next_time - time.time()
                if delay > 0:
                    time.sleep(delay)
                else:
                    # Running behind; reset schedule
                    next_time = time.time()
        finally:
            self.picam2.stop()

    def snapshot(self):
        # Return the latest JPEG and timestamp
        with self.lock:
            return self.last_jpeg, self.last_timestamp

    @staticmethod
    def _encode_jpeg(frame_np, quality=80):
        # Pillow expects an HxWx3 uint8 array. Note: Picamera2's "RGB888" format
        # is stored B,G,R in memory; if the stream's colours look swapped, request
        # "BGR888" in the configuration or reverse the channel order with
        # frame_np[..., ::-1] before encoding.
        img = Image.fromarray(frame_np)
        with io.BytesIO() as buf:
            img.save(buf, format="JPEG", quality=quality, optimize=True)
            return buf.getvalue()


def build_app(producer: FrameProducer, host, port):
    routes = web.RouteTableDef()

    @routes.get("/")
    async def index(_):
        # Minimal HTML page that embeds the stream
        html = f"""
<!doctype html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <title>Pi Zero W MJPEG Stream</title>
  <style>body {{ background: #111; color: #eee; font-family: sans-serif; }}</style>
</head>
<body>
  <h1>Raspberry Pi Zero W + Camera Module v2 — MJPEG Stream</h1>
  <p>Server: http://{host}:{port}</p>
  <img src="/stream.mjpg" style="max-width: 100%; border: 1px solid #444;" />
  <p><a href="/healthz">Health</a></p>
</body>
</html>
        """.strip()
        return web.Response(text=html, content_type="text/html")

    @routes.get("/healthz")
    async def health(_):
        jpeg, ts = producer.snapshot()
        payload = {
            "ok": jpeg is not None,
            "last_frame_ts": ts,
            "now": time.time(),
            "started": True,
        }
        return web.json_response(payload)

    @routes.get("/stream.mjpg")
    async def stream(_request):
        # Multipart MJPEG streaming
        headers = {
            "Age": "0",
            "Cache-Control": "no-cache, private",
            "Pragma": "no-cache",
            "Content-Type": "multipart/x-mixed-replace; boundary=FRAME",
            # CORS optional:
            "Access-Control-Allow-Origin": "*",
        }
        resp = web.StreamResponse(status=200, reason="OK", headers=headers)
        await resp.prepare(_request)

        last_sent_ts = 0.0
        try:
            while True:
                jpeg, ts = producer.snapshot()
                if jpeg is None:
                    await asyncio.sleep(0.02)
                    continue
                if ts == last_sent_ts:
                    # No new frame; small async wait
                    await asyncio.sleep(0.01)
                    continue
                last_sent_ts = ts

                await resp.write(BOUNDARY + CRLF)
                part_headers = b"".join(
                    [
                        b"Content-Type: image/jpeg\r\n",
                        f"Content-Length: {len(jpeg)}\r\n".encode(),
                        CRLF,
                    ]
                )
                await resp.write(part_headers)
                await resp.write(jpeg)
                await resp.write(CRLF)
                # Optional throttle per client if desired:
                await asyncio.sleep(0)
        except (ConnectionResetError, asyncio.CancelledError, BrokenPipeError):
            pass
        finally:
            try:
                await resp.write_eof()
            except Exception:
                pass
        return resp

    app = web.Application()
    app.add_routes(routes)
    return app


def parse_args():
    p = argparse.ArgumentParser(description="MJPEG web server for Raspberry Pi Zero W CSI camera")
    p.add_argument("--host", default="0.0.0.0", help="bind host (default: 0.0.0.0)")
    p.add_argument("--port", type=int, default=8080, help="bind port (default: 8080)")
    p.add_argument("--width", type=int, default=640, help="frame width (default: 640)")
    p.add_argument("--height", type=int, default=480, help="frame height (default: 480)")
    p.add_argument("--fps", type=int, default=10, help="target FPS (default: 10)")
    p.add_argument("--awb", default="auto", help="AWB mode (default: auto)")
    p.add_argument("--ae", default="auto", help="AE mode (default: auto)")
    return p.parse_args()


def main():
    args = parse_args()
    producer = FrameProducer(args.width, args.height, args.fps, awb=args.awb, ae=args.ae)
    producer.start()

    # Graceful shutdown
    def handle_sigterm(signum, frame):
        producer.stop()
        sys.exit(0)

    for sig in (signal.SIGINT, signal.SIGTERM):
        signal.signal(sig, handle_sigterm)

    app = build_app(producer, args.host, args.port)
    web.run_app(app, host=args.host, port=args.port, access_log=None)


if __name__ == "__main__":
    main()

Make it executable:

chmod +x ~/mjpeg-streamer/app.py

Build/Flash/Run Commands

1) Verify camera and permissions:
– Users needing camera access must be in the video group (pi is typically already a member):
groups
# If 'video' missing:
sudo usermod -aG video $USER
newgrp video

2) Run the server manually:

cd ~/mjpeg-streamer
source .venv/bin/activate
./app.py --host 0.0.0.0 --port 8080 --width 640 --height 480 --fps 10

3) Access the stream:
– HTML preview: http://rpi-zero.local:8080/
– Raw MJPEG: http://rpi-zero.local:8080/stream.mjpg

4) Optional: systemd service for autostart
Create /etc/systemd/system/mjpeg-stream.service:

[Unit]
Description=Raspberry Pi Zero W MJPEG streaming server (Picamera2)
After=network-online.target
Wants=network-online.target

[Service]
Type=simple
User=pi
WorkingDirectory=/home/pi/mjpeg-streamer
ExecStart=/home/pi/mjpeg-streamer/.venv/bin/python /home/pi/mjpeg-streamer/app.py --host 0.0.0.0 --port 8080 --width 640 --height 480 --fps 10
Restart=on-failure
RestartSec=3
# Only needed if you later bind to a privileged port (below 1024), e.g. 80:
AmbientCapabilities=CAP_NET_BIND_SERVICE

[Install]
WantedBy=multi-user.target

Enable and start:

sudo systemctl daemon-reload
sudo systemctl enable mjpeg-stream.service
sudo systemctl start mjpeg-stream.service
sudo systemctl status mjpeg-stream.service --no-pager

Step-by-step Validation

1) Confirm OS and Python:
cat /etc/os-release => Bookworm
python3 --version => Python 3.11.x

2) Validate camera hardware:
libcamera-hello -t 2000 should not error.
libcamera-jpeg -o test.jpg and open test.jpg locally.
– If errors occur, revisit cable orientation and /boot/firmware/config.txt.

3) Validate Picamera2 imports:
python -c "from picamera2 import Picamera2; print('Picamera2 OK')"

4) Run the server (foreground) and observe logs:
./app.py --port 8080
– Note initial CPU usage with top or htop. At 640×480@10fps, Zero W should handle streaming to a single client.

5) Verify endpoint behavior:
– HTML:
– Visit http://rpi-zero.local:8080/
– You should see a moving image, with latency under ~300 ms depending on Wi-Fi.
– Raw stream (headers):
curl -v http://rpi-zero.local:8080/stream.mjpg --output /dev/null
Look for Content-Type: multipart/x-mixed-replace; boundary=FRAME.

6) Confirm JPEG frame boundaries:
– Peek at the first bytes:
curl -s http://rpi-zero.local:8080/stream.mjpg | head -c 2048 | hexdump -C | head
You should observe MIME part headers, then JPEG magic bytes ff d8 (SOI) and ff d9 (EOI) between frames.
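
If you prefer to do this check programmatically, the sketch below (standard library only; it assumes the server above is reachable at rpi-zero.local:8080) reads the multipart stream, cuts out the first complete JPEG between the ff d8 / ff d9 markers, and saves it to disk:

#!/usr/bin/env python3
# Minimal MJPEG client sketch: read /stream.mjpg and save the first JPEG frame.
# The URL is an assumption; adjust host/port to match your setup.
import urllib.request

URL = "http://rpi-zero.local:8080/stream.mjpg"

with urllib.request.urlopen(URL, timeout=10) as resp:
    buf = b""
    while True:
        chunk = resp.read(4096)
        if not chunk:
            break
        buf += chunk
        soi = buf.find(b"\xff\xd8")           # JPEG start-of-image
        eoi = buf.find(b"\xff\xd9", soi + 2)  # JPEG end-of-image
        if soi != -1 and eoi != -1:
            with open("frame.jpg", "wb") as f:
                f.write(buf[soi:eoi + 2])
            print(f"Saved frame.jpg ({eoi + 2 - soi} bytes)")
            break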

7) Test with VLC:
– Media -> Open Network Stream -> URL: http://rpi-zero.local:8080/stream.mjpg
– Playback should start within 1–2 seconds.

8) Load test lightly:
– Open the MJPEG URL in two browsers simultaneously.
– Observe CPU load. If the Zero W struggles, reduce FPS (--fps 8) or resolution (--width 424 --height 240).
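
To put rough numbers on the load test, a small sketch like this one (standard library only; URL, client count, and duration are assumptions to adjust) opens several concurrent connections and counts how many JPEG frames each client receives:

#!/usr/bin/env python3
# Rough load-test sketch: each client reads the MJPEG stream for DURATION seconds
# and counts JPEG start-of-image markers as an approximation of received frames.
import threading
import time
import urllib.request

URL = "http://rpi-zero.local:8080/stream.mjpg"  # adjust to your Pi
CLIENTS = 2
DURATION = 10  # seconds

def reader(idx, results):
    frames = 0
    deadline = time.time() + DURATION
    with urllib.request.urlopen(URL, timeout=10) as resp:
        while time.time() < deadline:
            chunk = resp.read(4096)
            if not chunk:
                break
            frames += chunk.count(b"\xff\xd8")
    results[idx] = frames

results = {}
threads = [threading.Thread(target=reader, args=(i, results)) for i in range(CLIENTS)]
for t in threads:
    t.start()
for t in threads:
    t.join()
for idx, frames in sorted(results.items()):
    print(f"client {idx}: ~{frames / DURATION:.1f} FPS ({frames} frames in {DURATION}s)")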

9) Verify systemd service:
sudo systemctl restart mjpeg-stream.service
sudo journalctl -u mjpeg-stream.service -e --no-pager to see logs.
– Confirm autostart by rebooting:
sudo reboot
After boot, check: systemctl is-active mjpeg-stream.service.


Troubleshooting

  • Camera not detected / libcamera errors:
  • Re-check ribbon cable orientation and seating.
  • Ensure GPU memory is at least 128 MB: grep gpu_mem /boot/firmware/config.txt.
  • On Bookworm, avoid enabling the legacy camera stack; use libcamera/Picamera2.
  • Confirm user in video group: groups.

  • ImportError: cannot import Picamera2:

  • Ensure python3-picamera2 is installed via apt, and your venv was created with --system-site-packages.
  • Test outside venv: python3 -c "from picamera2 import Picamera2".

  • Poor performance / high CPU:

  • Lower --fps to 8 or 6.
  • Use smaller frame size, e.g., --width 424 --height 240.
  • Place the Pi and client close to the Wi‑Fi AP; the Zero W only supports 2.4 GHz.
  • Power-throttling can harm performance; use a solid PSU and short cable.
  • Consider heat: small heatsink can stabilize prolonged operation.

  • Frame drops / stutter with multiple clients:

  • The example serves the “latest frame” to each client; under load, each client may see occasional skipped frames (expected).
  • If you need per-client buffering, implement a per-client queue and consumer, or move to an encoder-fed broadcaster (see Improvements).
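
As a rough illustration of the per-client queue idea (a design sketch, not part of app.py): the capture side publishes each new JPEG to every connected client's bounded asyncio queue, dropping the oldest frame when a slow client falls behind.

# Sketch: per-client broadcast queues for fairer multi-client delivery.
# publish() must be called from the event loop thread (e.g. via
# loop.call_soon_threadsafe from the capture thread).
import asyncio

class FrameBroadcaster:
    def __init__(self, maxsize=2):
        self.maxsize = maxsize
        self.queues = set()

    def register(self):
        q = asyncio.Queue(maxsize=self.maxsize)
        self.queues.add(q)
        return q

    def unregister(self, q):
        self.queues.discard(q)

    def publish(self, jpeg: bytes):
        for q in self.queues:
            if q.full():
                q.get_nowait()  # drop the oldest frame for a slow client
            q.put_nowait(jpeg)

# In a stream handler: q = broadcaster.register(); jpeg = await q.get(); ...
# and broadcaster.unregister(q) in a finally block.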

  • Stream won’t display in browser:

  • Ensure the MIME type is correct and the boundary is exactly FRAME.
  • Confirm there’s no caching proxy between client and Pi; we set no-cache headers.
  • Try another browser or VLC to isolate client-side issues.

  • Port conflicts:

  • If 8080 is in use, pick another port: --port 8081, and update firewall/NAT if applicable.

  • mDNS issues (rpi-zero.local not resolving):

  • Use the IP directly: hostname -I to find it (e.g. http://192.168.1.50:8080/).
  • Ensure avahi-daemon runs (typically present in Raspberry Pi OS).

  • Camera exposure/white balance issues:

  • Adjust --awb and --ae, or use picam2.set_controls for finer control (Picamera2 control names such as ExposureTime, AnalogueGain, and AwbEnable); a short sketch follows. For night scenes, reduce FPS to allow longer exposures.
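
A short sketch of manual exposure/white-balance control (the numeric values are illustrative assumptions; tune them for your scene):

# Sketch: manual exposure and white balance with Picamera2.
from picamera2 import Picamera2

picam2 = Picamera2()
picam2.configure(picam2.create_video_configuration(main={"size": (640, 480)}))
picam2.start()

picam2.set_controls({
    "AeEnable": False,          # disable auto exposure
    "ExposureTime": 20000,      # exposure in microseconds
    "AnalogueGain": 4.0,        # sensor analogue gain
    "AwbEnable": False,         # disable auto white balance
    "ColourGains": (1.6, 1.4),  # manual red/blue gains
})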

Improvements

  • Use Picamera2’s MJPEGEncoder to reduce CPU:
  • Picamera2 can encode MJPEG in the camera pipeline. Implement a custom FileOutput sink that splits frames and distributes them to clients. This avoids Python-level RGB→JPEG conversion for every frame and can significantly lower CPU load on the Zero W.
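
A condensed sketch of that approach, following the pattern used in the Picamera2 examples (the StreamingOutput class is an illustration, not part of app.py above):

# Sketch: let Picamera2 encode MJPEG inside the camera pipeline and hand each
# finished JPEG to a small sink object (pattern from the Picamera2 examples).
# MJPEGEncoder uses the V4L2 hardware codec where available; JpegEncoder is a
# software alternative with the same interface.
import io
from threading import Condition

from picamera2 import Picamera2
from picamera2.encoders import MJPEGEncoder
from picamera2.outputs import FileOutput

class StreamingOutput(io.BufferedIOBase):
    """Keep only the most recent JPEG frame; clients wait on the condition."""
    def __init__(self):
        self.frame = None
        self.condition = Condition()

    def write(self, buf):
        with self.condition:
            self.frame = buf
            self.condition.notify_all()

picam2 = Picamera2()
picam2.configure(picam2.create_video_configuration(main={"size": (640, 480)}))
output = StreamingOutput()
picam2.start_recording(MJPEGEncoder(), FileOutput(output))

# An HTTP handler then waits for frames:
#   with output.condition:
#       output.condition.wait()
#       frame = output.frame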

  • Alternative pipeline via libcamera-vid:

  • libcamera-vid supports --codec mjpeg to write an MJPEG elementary stream to stdout. You can spawn it as a subprocess and demux frames into an HTTP boundary stream in Python. This is efficient and offloads JPEG compression.
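
A sketch of that pipeline (the flags are standard libcamera-vid/rpicam-vid options; the frame-splitting logic works because an MJPEG elementary stream is simply concatenated JPEGs):

# Sketch: spawn libcamera-vid, read its MJPEG output from stdout, and split it
# into individual JPEG frames on the SOI (ff d8) / EOI (ff d9) markers.
import subprocess

CMD = [
    "libcamera-vid", "-t", "0", "-n",   # run indefinitely, no preview
    "--codec", "mjpeg",
    "--width", "640", "--height", "480",
    "--framerate", "10",
    "-o", "-",                          # write the stream to stdout
]

proc = subprocess.Popen(CMD, stdout=subprocess.PIPE, bufsize=0)
buf = b""
try:
    while True:
        chunk = proc.stdout.read(4096)
        if not chunk:
            break
        buf += chunk
        while True:
            soi = buf.find(b"\xff\xd8")
            eoi = buf.find(b"\xff\xd9", soi + 2)
            if soi == -1 or eoi == -1:
                break
            jpeg = buf[soi:eoi + 2]
            buf = buf[eoi + 2:]
            # hand `jpeg` to your HTTP broadcaster here
finally:
    proc.terminate()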

  • Add authentication and TLS:

  • Put NGINX (or Caddy) in front of the stream for HTTPS and HTTP Basic Auth.
  • Or implement token-based auth in the aiohttp handlers.
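
A minimal aiohttp middleware sketch for the token-based option (the token value and the query-string fallback are assumptions; browsers embedding the stream in an <img> tag cannot set headers, hence the ?token= variant):

# Sketch: simple bearer-token check as an aiohttp middleware. Illustrative only;
# pair it with HTTPS (e.g. via a reverse proxy) so the token is not sent in clear.
from aiohttp import web

API_TOKEN = "change-me"  # assumption: load from an environment variable in practice

@web.middleware
async def token_auth(request, handler):
    auth = request.headers.get("Authorization", "")
    token = request.query.get("token", "")
    if auth != f"Bearer {API_TOKEN}" and token != API_TOKEN:
        return web.Response(status=401, text="unauthorized")
    return await handler(request)

# When building the app:
#   app = web.Application(middlewares=[token_auth])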

  • System hardening:

  • Run under a dedicated user with minimal privileges.
  • Bind only on LAN interfaces (--host 192.168.x.y) or behind a reverse proxy.

  • Adaptive quality:

  • Measure outbound bitrate and drop resolution/FPS or raise JPEG compression on-the-fly when clients are slow or CPU load is high.
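
One simple way to approach this is to adapt the JPEG quality based on how long the previous frame took to encode and send; the thresholds below are illustrative assumptions:

# Sketch: naive adaptive JPEG quality controller.
def next_quality(current_quality, last_frame_seconds, target_seconds):
    """Lower quality when we fall behind the target frame period, raise it when there is headroom."""
    if last_frame_seconds > 1.5 * target_seconds:
        return max(30, current_quality - 10)  # falling behind: compress harder
    if last_frame_seconds < 0.5 * target_seconds:
        return min(85, current_quality + 5)   # headroom: improve quality
    return current_quality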

  • Snapshot endpoint:

  • Add /snapshot.jpg that serves a single JPEG (useful for integrations that poll stills).
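
A handler sketch that slots into build_app() in app.py above (it reuses producer.snapshot(); the route name is a suggestion):

# Sketch: single-frame snapshot endpoint, added alongside the other routes.
@routes.get("/snapshot.jpg")
async def snapshot(_request):
    jpeg, _ts = producer.snapshot()
    if jpeg is None:
        return web.Response(status=503, text="no frame available yet")
    return web.Response(
        body=jpeg,
        content_type="image/jpeg",
        headers={"Cache-Control": "no-cache, private"},
    )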

  • Metrics:

  • Track per-client throughput and FPS; expose via /metrics (Prometheus format).
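
A bare-bones sketch in Prometheus text exposition format (the metric names are assumptions; extend with per-client counters as needed):

# Sketch: minimal /metrics endpoint, added alongside the other routes.
@routes.get("/metrics")
async def metrics(_request):
    jpeg, ts = producer.snapshot()
    lines = [
        "# TYPE mjpeg_last_frame_age_seconds gauge",
        f"mjpeg_last_frame_age_seconds {time.time() - ts if ts else -1}",
        "# TYPE mjpeg_last_frame_bytes gauge",
        f"mjpeg_last_frame_bytes {len(jpeg) if jpeg else 0}",
    ]
    return web.Response(text="\n".join(lines) + "\n", content_type="text/plain")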

  • H.264/HLS alternative:

  • For bandwidth-constrained networks, consider H.264 with hardware acceleration and serve HLS or RTSP. Note: the objective here is MJPEG, but for production use and many clients, H.264 often performs better.

Final Checklist

  • Hardware
  • Raspberry Pi Zero W powered with a stable 5V/2.5A PSU.
  • Camera Module v2 connected with the correct Pi Zero camera cable; ribbon properly oriented and latched.
  • Optional heatsink applied for continuous operation.

  • OS and interfaces

  • Raspberry Pi OS Bookworm Lite (32-bit) installed and booted.
  • GPU memory set to 128 MB in /boot/firmware/config.txt.
  • Camera usable with libcamera: libcamera-hello and libcamera-jpeg succeed.

  • Python and dependencies

  • Python 3.11 in use.
  • Picamera2 installed via apt; venv created with --system-site-packages.
  • Pip packages installed: aiohttp, Pillow.
  • Optionally installed via apt: python3-gpiozero python3-smbus python3-spidev.

  • Application

  • app.py created in ~/mjpeg-streamer and executable.
  • Server runs: ./app.py --host 0.0.0.0 --port 8080 --width 640 --height 480 --fps 10.

  • Validation

  • HTML: http://rpi-zero.local:8080/ shows live video.
  • Raw MJPEG: http://rpi-zero.local:8080/stream.mjpg works in VLC/curl.
  • Health endpoint: /healthz returns JSON.
  • Acceptable CPU load for intended FPS/resolution.

  • Service

  • systemd unit installed, enabled, and active.
  • Service survives reboot; logs accessible via journalctl.

By following the above, you have a reproducible, fully working MJPEG-over-HTTP streaming solution for the CSI camera on the Raspberry Pi Zero W + Camera Module v2, tailored to Bookworm and the modern libcamera/Picamera2 stack. Adjust resolution, FPS, and JPEG quality as required to balance CPU usage, latency, and image quality for your application.

Quick Quiz

1. What is the primary objective of the project described in the article?
2. Which operating system is recommended for use with the Raspberry Pi Zero W in this project?
3. What type of camera is used in this project?
4. What is the required power supply for the Raspberry Pi Zero W?
5. Which version of Python is mentioned in the article?
6. What is the experience level required for this project?
7. What type of networking access is needed for the Raspberry Pi Zero W?
8. What is the camera cable type required for connecting the Camera Module v2 to the Pi Zero?
9. What is the minimum microSD card capacity recommended for this project?
10. What does the article suggest about the Raspberry Pi Zero W's CPU architecture?

Carlos Núñez Zorrilla
Electronics & Computer Engineer

Telecommunications Electronics Engineer and Computer Engineer (official degrees in Spain).