9 Answers
When I explain scanlines to friends during a retro gaming night, I usually put it like this: CRTs draw images with a moving beam, row by row, and older standards used interlaced fields. That means half the picture is refreshed one moment and the other half the next, so if you stop time with a camera or if your capture device doesn't sync, the gaps show as dark horizontal lines.
It’s also about persistence and recording: phosphor brightness fades after being hit by the beam, and cameras sample frames at their own rates, creating beat frequencies and moiré that make the lines pop. Add VHS or composite encoding into the mix, which blurs fine details and compresses color differently from brightness, and those scanlines get more pronounced. Personally, I kind of appreciate the look — it tells me the footage is from a different era, and it feels authentically nostalgic.
I like to experiment, so I’ve played with a CRT, a DSLR, and a capture card to see how scanlines show up under different conditions. What I observed was clear: with a fast shutter and no sync, the camera freezes the phosphor’s glow per scan, making each beam pass visible as a line. Slow shutter speeds blur those lines into a smoother image. If you record directly from the video output instead of filming the screen, you still get interlacing artifacts if the signal is 480i — fields captured separately lead to line-like banding, especially in areas of high contrast.
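To put a rough number on that sync mismatch, here’s a back-of-the-envelope sketch in Python; the NTSC-style 59.94 Hz field rate and the 30 fps camera are assumed illustrative numbers, not measurements from my actual setup:

```python
# Sketch: why an unsynced camera sees bands crawl on a CRT.
# The banding drifts at the "beat" between the CRT's field rate and
# the camera's sampling of full frames (two fields per frame).

crt_field_rate_hz = 59.94   # interlaced fields per second (NTSC-style)
camera_fps = 30.0           # assumed camera frame rate

# Each camera frame should ideally cover exactly two fields; the
# leftover difference is the beat frequency of the visible bands.
beat_hz = abs(crt_field_rate_hz - 2 * camera_fps)
print(f"beat frequency: {beat_hz:.2f} Hz")  # ~0.06 Hz: bands drift slowly
```

With a truly genlocked capture the beat would be zero and the bands would sit still instead of crawling.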
This is why retro enthusiasts sometimes prefer simulated scanlines in emulators — they reduce harsh pixelation and mimic how CRTs naturally blurred and blended pixels. I’m always torn between cleaning footage and preserving that analog texture; personally I love a little scanline warmth in my retro captures.
If you stare at old VHS tapes or capture cards from the 80s, those horizontal black stripes become part of the vibe — but they aren't decoration, they're physics. A CRT paints the picture one thin horizontal sweep at a time: an electron beam scans across the glass, lighting phosphor lines that glow briefly. Older TVs often used interlaced video, which splits each frame into two fields (odd and even lines). When you see the fields separated or when a camera records the display without syncing to the TV's refresh, the alternating lines become visible as scanlines.
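To get a feel for how briefly those phosphor lines glow, here’s a toy exponential-decay model; the 2 ms time constant is purely an assumption for illustration (real phosphor persistence varies a lot between tubes):

```python
# Toy model of phosphor persistence: after the beam passes, the glow
# decays roughly exponentially with some time constant tau.
import math

tau_ms = 2.0                     # assumed phosphor decay time constant
field_period_ms = 1000 / 59.94   # ~16.7 ms between fields (NTSC-style)

# Fraction of peak brightness left by the time the next field is drawn:
remaining = math.exp(-field_period_ms / tau_ms)
print(f"{remaining:.4%} of peak brightness remains")  # effectively dark
```

That rapid decay is exactly why a fast, unsynced shutter catches some lines lit and others already faded to black.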
There's another layer to it: recording tech and video standards. Film cameras or early camcorders interact with the CRT's refresh rate and phosphor decay, producing flicker or banding. Composite signals and VHS also limit horizontal and vertical resolution and smear chroma/luma differently, which emphasizes those dark gaps. Even the shadow mask or aperture grille inside the tube can make the beam land in strips, so the screen’s physical construction plays a role too. I love how those imperfections give old footage character — like a cozy grain that screams 'retro' every time I watch it.
I once had to digitize a trove of old VHS recordings for a small indie archive, and scanlines became my regular puzzle. Practically speaking, the lines were most pronounced when the CRT’s refresh and my camera’s shutter weren’t locked — that mismatch let me literally see the beam move. VCRs compounded issues with uneven tracking and noise, and composite video blurring reduced detail so the dark separations between phosphor lines showed up more starkly.
My workflow was to use a capture device with genlock when possible, match the output frame rate (progressive capture from interlaced sources can be handled by proper deinterlacing), and clean up with temporal smoothing tools. Sometimes field blending preserved motion better than bob deinterlacing. But aesthetically I learned to respect the lines: they tell a story about old tech and the era’s limits. Digitizing taught me to treat scanlines as historical fingerprints rather than mere defects, a little nostalgic scar of analog days.
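For anyone curious what field splitting, bob, and blending actually do to the lines, here’s a toy sketch on a tiny grayscale frame; the helper names and the simplified logic are mine, not from any real deinterlacer:

```python
# Toy deinterlacing sketch on a "frame" stored as rows of ints.

def split_fields(frame):
    """Separate an interlaced frame into its two fields."""
    even = frame[0::2]  # lines 0, 2, 4, ...
    odd = frame[1::2]   # lines 1, 3, 5, ...
    return even, odd

def bob(field):
    """Bob: line-double one field to full height (duplicates lines)."""
    return [row for row in field for _ in (0, 1)]

def blend(frame):
    """Blend: average each line with the next, softening combing."""
    out = []
    for i, row in enumerate(frame):
        nxt = frame[min(i + 1, len(frame) - 1)]
        out.append([(a + b) // 2 for a, b in zip(row, nxt)])
    return out

frame = [[10, 10], [90, 90], [10, 10], [90, 90]]
even, odd = split_fields(frame)
print(bob(even))    # [[10, 10], [10, 10], [10, 10], [10, 10]]
print(blend(frame)) # neighboring rows averaged: the striping is smoothed
```

Bob keeps full temporal resolution but halves vertical detail per output frame; blend keeps all the lines but ghosts fast motion, which is the trade-off I kept wrestling with.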
Watching old broadcasts with scanlines feels like peeking through a window stitched by time. The simplest root cause is that CRTs don’t paint an entire frame at once — they trace thin horizontal lines with an electron beam, and older video used interlaced fields, so half the lines arrive at a slightly different moment than the others. When capture devices or film cameras don’t sync perfectly to that timing, the dark gaps between the illuminated lines show up as stripes.
There’s also the tube’s hardware — phosphor glow, shadow masks, and the way composite signals reduce detail — which all make those lines more visible. I kind of treasure that imperfect texture; it makes old footage feel lived-in and warm.
Those thin dark bars you see in old CRT footage come straight from how those TVs actually painted the picture. A cathode-ray tube draws an image by sweeping an electron beam across the screen line by line — a raster. On early television that sweep was interlaced: the set drew every other line (a field), then went back and filled the gaps with the next field. When you film or digitize that with equipment that isn’t locked to the TV’s scan timing, you end up capturing the momentary gaps between fields or the faint phosphor decay, which shows up as scanlines.
There’s also a texture to the phosphor and the shadow mask or aperture grille inside the tube that makes each horizontal pass look distinct. Old cameras, especially when used to record a CRT off-screen, introduced rolling shutters, frame-rate mismatches, and sometimes even aliasing. VCRs and analog recording added their own jitter and noise. So what you’re seeing is a mix of electron-beam geometry, interlaced fields, and the imperfect way analog gear and later digital cameras sampled that signal — it’s science that also looks oddly cinematic, and I kind of love that analog vibe.
Technically speaking, scanlines are a natural consequence of raster scanning and video timing. A CRT rasterizes the image: an electron gun sweeps horizontally to create discrete lines, with horizontal retrace between lines and vertical retrace (the beam returning to the top) between fields. In analog broadcast systems like NTSC or PAL, interlaced scanning sends alternating fields of lines to double the perceived motion smoothness within limited bandwidth, so one field contains the odd-numbered lines and the next contains the even-numbered ones. If you capture or transfer interlaced video without proper deinterlacing, the field separation becomes visible as stripe artifacts.
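A tiny sketch of that stripe artifact: the two fields sample the scene a moment apart, so if you naively weave them back into one frame after something moved, alternate lines disagree. The moving bright edge below is a made-up example:

```python
# Sketch: naive weave of two fields captured at different moments.

def weave(even_field, odd_field):
    """Interleave two fields back into a full frame (naive weave)."""
    frame = []
    for e, o in zip(even_field, odd_field):
        frame.extend([e, o])
    return frame

# A bright edge at column 1 when the first field is sampled...
even = [[0, 255, 0, 0], [0, 255, 0, 0]]
# ...has moved to column 2 by the time the second field arrives.
odd = [[0, 0, 255, 0], [0, 0, 255, 0]]

for row in weave(even, odd):
    print(row)  # alternating rows disagree: classic combing/stripes
```

In a static scene the two fields agree and weave is lossless; it’s motion between fields that produces the striping a deinterlacer has to resolve.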
Recording methods aggravate the effect. Film cameras or early video recorders sample at frame rates that can beat against the CRT refresh, producing temporal aliasing and banding. The CRT’s phosphor persistence, the shadow mask or aperture grille geometry, and the limited horizontal resolution from composite encoding all contribute. Modern deinterlacing, temporal filtering, and high-res digitization can obscure or remove these lines, but I still find the raw scanline texture oddly satisfying—it speaks to the engineering limits of a gorgeous age of display tech.
Quick technical rundown: scanlines originate from the CRT’s raster scan and interlaced output. The CRT paints horizontal lines sequentially using an electron beam; the phosphor dots or stripes glow briefly and then decay. In interlaced systems each frame is split into two fields (odd and even lines), so if capture equipment isn’t synchronized (no genlock) or the shutter speed beats against the refresh, the difference between fields becomes visible as dark lines. Add in analog recording artifacts, limited vertical resolution, and possible aliasing from sampling, and those lines become even more pronounced. I find the mixture of electrical engineering and visual texture fascinating and kind of satisfying to watch.
Back when I played games on actual CRTs, scanlines felt like part of the aesthetic rather than a flaw. From my perspective as a gamer, those horizontal lines are just the visible footprint of how the TV rendered pixels: the electron beam drew lines, phosphors glowed, then faded. When modern cameras or capture cards try to record that, they often can’t sync perfectly with the display’s refresh, so you see the beam’s progress as brighter and darker stripes. Interlaced formats like 480i add to it because frames are sent in two fields; if the capture device treats those fields incorrectly you get distinct lines.
I also noticed that emulators sometimes add simulated scanlines intentionally because they smooth jagged pixel edges and make sprites look more cohesive on high-resolution displays. If you want to remove scanlines when capturing, deinterlacing and matching frame rates help a lot — but sometimes I prefer to keep them for nostalgia; it’s part of the charm that reminds me of late-night gaming sessions.
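For the curious, here’s roughly how an emulator-style scanline filter can be sketched: upscale, then darken every other row. The 2x nearest-neighbor upscale and the 50% darkening factor are my assumptions, not any particular emulator’s filter:

```python
# Sketch of a simulated-scanline filter on a grayscale "sprite"
# stored as a list of rows of pixel values (0-255).

def upscale_2x(img):
    """Nearest-neighbor 2x upscale: duplicate every pixel and row."""
    return [[p for p in row for _ in (0, 1)] for row in img for _ in (0, 1)]

def add_scanlines(img, strength=0.5):
    """Dim every odd row to mimic the dark gaps between CRT lines."""
    return [
        [int(p * (1 - strength)) if y % 2 else p for p in row]
        for y, row in enumerate(img)
    ]

sprite = [[200, 100], [100, 200]]
out = add_scanlines(upscale_2x(sprite))
for row in out:
    print(row)
```

Because each source row becomes one bright line and one dimmed line, the hard pixel edges read as softer bands, which is why sprites look more cohesive with the filter on.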