Retro graphics are something I’ve adamantly defended since long before it was cool, and indeed essentially ever since LCD replaced CRT. First, it was "but what we have now is BETTER." Then, it became a matter of a flood of thinly-masked laziness that no one seemed to care about. Later, it became a matter of people having simply forgotten how everything looked. Finally, it became an honest matter of so much time having passed that people had legitimately been born after CRT all but disappeared and had never seen it at all. You ever feel like you’re slowly drowning in being called various forms of crazy? And then, suddenly, being vindicated? That vindication came to me earliest in the form of RetroTVFX for Unity, because someone else GOT it. Then CRT Pixels got an explosion of attention on Twitter (PRO Pixels is a similar account also worth following). Then suddenly CRT sets, or at least the super desirable premium Sony Trinitron sets, became a gamer luxury item. And, like, you ever just feel that moment where you’ve been right all along and, while nobody directly acknowledges it, you don’t care and are able to just bask in the warm glow?
So let’s talk about the warm glow of a CRT and why it’s just so blasted hard to reproduce.
What is CRT?
CRT stands for Cathode Ray Tube, and the principle isn’t entirely dissimilar from a fluorescent light bulb, except the whole point isn’t necessarily to light the whole thing up as brightly as possible. The name comes from the fact that the entire thing is built around an electron gun whose beam a magnetic yoke quickly sweeps in lines across the screen, into the waiting phosphors slathered on the viewing end. Color CRT is just that with three beams instead of one and a mask to keep the individual beams from hitting the wrong red, green, and blue phosphors by filtering out their individual angles as needed.
A CRT is an analog format purely concerned with lines, and the brightness along the path of those lines is dependent on the voltage supplied to the electron gun by whatever. When I say "whatever," what I mean is something has to provide the signal. CRT is dumb technology, like a corded landline phone. There’s no computer to speak of in there making it work. All that processing has to happen somewhere else. This glosses a bunch over, but suffice to say there are some timing blips that happen and then the juicy bits of the actual picture, before the whole thing repeats offset by half a line to fill in the blanks the first pass of the picture left, rinse and repeat.
So basically the takeaway is it’s sort of like waving a flashlight at a wall only the flashlight has a dimmer switch and by controlling it you draw Elvis.
Now, when I say the TV needs something to pass the signal, what I mean is the TV literally just does what it’s told and really has no sense of its own state, so appropriate hardware can tell it to do pretty much anything. It can stop at any point midway and as long as the TV gets a fresh set of timing blips it’ll just start over. It’s completely up to the outside hardware to choose how to time that, and that’s no more readily apparent than with game consoles, which would often cut off the picture once it got into overscan and double the framerate by simply never filling in the blanks left by the first pass, which is where we get scanlines from. The systems got that doubled framerate by only drawing half as much of the picture, and nobody really cared all that much about the scanlines, provided they even noticed in the first place. Playing from any distance your parents won’t yell at you for ruining your eyes at, they’re not actually all that apparent; the screen just looks a little dimmer, not like someone drew across it in Sharpie.
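If the trick is hard to picture, here’s a minimal sketch in Python (assuming NumPy; the frame and raster sizes are just for illustration, not any console’s exact numbers) of what "never filling in the blanks" amounts to: every pass of the picture lands on the same field of the raster, and the other field’s rows simply never get written.

```python
import numpy as np

def draw_240p_on_480_raster(frame_240: np.ndarray) -> np.ndarray:
    """Illustrative only: park a 240-line frame on one field of a 480-line
    raster every pass, leaving the other field's rows dark -- those untouched
    rows are the gaps people call scanlines."""
    h, w, c = frame_240.shape                  # expect (240, width, 3)
    raster = np.zeros((h * 2, w, c), dtype=frame_240.dtype)
    raster[0::2] = frame_240                   # only one field ever gets drawn
    return raster

# toy usage: a flat gray 240-line frame
frame = np.full((240, 320, 3), 128, dtype=np.uint8)
raster = draw_240p_on_480_raster(frame)        # rows 1, 3, 5, ... stay black
```

The point isn’t that any console contains code like this; it’s that the dark lines aren’t drawn at all, they’re just the rows nothing ever lights up.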
Why is it so hard to reproduce?
To start this off, because LCD knows what a pixel is. Okay, so there are a LOT of factors in this, but CRT first off has zero concept of a discrete point of light much less a square one like a pixel is. CRT is all about the light of phosphors glowing in a way that is decidedly not square and in fact not even something that strictly has defined corners despite being nominally rectangular. A CRT is emissive in the exact same way a CFL is emissive, with diffuse light. It’s something that you simply cannot easily fake on a modern screen ironically because modern screens are not accurate enough to replicate the inaccuracy in a lossless manner. See, the common LCD screen does not use additive light phosphors; it uses subtractive liquid crystal filters. This means that the white balance is your peak and everything goes down from there. In contrast, CRT is all about your black level and can only go UP from there, so whatever voltage you pump into it is what the electron gun will shoot, from a solid "nothing" to "ka-me-ha-me-HAAAA!" OLED feels very promising to me personally because it’s also emissive and can thus do the same basic thing CRT does by adding rather than subtracting light, and, at least with a high enough DPI, doing it with a reasonable approximation of the CRT’s warm glow.
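As a toy way to frame that additive-versus-subtractive difference (the constants here are arbitrary placeholders, not measurements of any real display):

```python
# Toy framing, not colorimetry: an LCD subpixel can only attenuate a fixed
# backlight, while a CRT phosphor adds light on top of true black.
BACKLIGHT = 250.0          # arbitrary luminance units for the LCD's white point
PHOSPHOR_GAIN = 250.0      # arbitrary peak output for a fully driven phosphor

def lcd_subpixel(transmission: float) -> float:
    """transmission in 0..1: the crystal filter can only subtract from the backlight."""
    return BACKLIGHT * max(0.0, min(1.0, transmission))

def crt_subpixel(beam_level: float) -> float:
    """beam_level in 0..1: the electron gun can only add light, starting from nothing."""
    return PHOSPHOR_GAIN * max(0.0, min(1.0, beam_level))
```

Both top out in the same place; the difference is which end of the scale each technology actually nails. In practice the LCD’s "zero" still leaks backlight, while the CRT’s zero really is the dark glass of the tube.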
When I talk about the "warm glow," it’s a lot like how a vinyl fan will talk about the warm sound of a record over the more mechanical fidelity of a CD or the compression of an MP3. We live in an analog world, and so the digital can in many ways feel striking in comparison. There are no "jaggies" on a CRT like you have on LCD because it all gets smoothed out, so while the picture maybe isn’t as sharp, it also feels more natural in many ways and in others more dreamlike. People commented early in the HD era about how shows like The Sopranos had the fourth wall taken down because the actors’ stubble was suddenly visible, things felt too real, and all their terrible deeds suddenly had less distance and fantasy to them. At the same time, a look at Final Fantasy X on a CRT made for a game that looks far more realistic than its own HD remaster, in no small part because the proportions were better and there was alpha blending, but also because a lot of the vertices were lost in the bloom of the screen.

Older games looked fantastic on CRT because after a certain point, everything on it was equally real, whether that was thanks to gorgeous artistry or to someone saying some lines in front of a camera. When technology and artistry have caught up to the limits of the format, the line between fantasy and reality can blur as much as you want it to. This doesn’t mean the limitations make the format bad. With Serious Gamers™ now gobbling up Trinitron screens, even though we’re so used to jaggies all over the place as part of the "sharpness" (anti-aliasing is expensive) while that smoothing is just a core part of the CRT format, the things people decried as limitations decades ago are once again seen as features.

But most of all, the warm glow of CRT is exactly that: a glow. CRT is defined by the glow of its phosphors as a light generator in three carefully chosen colors that give a natural light balance. There’s an interesting history of why those three phosphors were chosen (in reality there were better colors available for a wider gamut, but the phosphors that registered at the right red wavelength were too dim to keep up with the teal green and violet ones until it became too late to change, and the final red was even briefly considered for the job before green and blue were adjusted around it to preserve a pure white), but ultimately they determined the RGB gamut that’s still the standard today, and they did it by being the best means to express our natural world when the technology was being established.
Ultimately, something that generates light is going to behave differently than something that blocks light, simply because when you generate light, you’re working with something additive rather than something subtractive like printing. In something generative, the choice of colors is made for an ideal path to a pure white, and the science behind it has long been settled. In something subtractive, you rely on there being a good enough white to subtract from, and in that regard, not all screens are created equal. Not in brightness; not in keeping a good neutral white; not even really in how they might age over time, given many LCD screen backlights will shift bluer as they age, damaging your yellows and other warm colors in the process. Even if that were to happen to a CRT, every CRT screen has options to adjust things to fix it, and none of it relies on working around a single point of failure. Until a CRT dies completely, the screen is always some level of fixable, even if it requires some external tool like a degausser for magnetic damage. LCD is ultimately bad for emulating the look of CRT because it sought to solve the problem of CRT’s inaccuracy by going digital, and while it succeeded in that, it means reproducing the appearance of the analog format without notable loss requires fidelity at a level beyond our ability to discern the compromises. This is easy enough in the living room at a standard 9-foot viewing distance, but breaks down considerably on PC, where viewing distance is probably 3 feet at most.
No pixels? Really?
To put it this way, the game systems had pixels, but CRT does not. Any game system needs to create an image to pass somehow, and the easiest way to do that is to form a raster image and then just sample it to finagle the voltages through whatever interface the system is using. Those voltages then get passed to the electron guns and tell them how hard to fire at a given point in a line. After that, it’s totally up to whatever phosphors happen to be between the electron beam and the viewer. The CRT doesn’t care. It doesn’t even know where the phosphors are. And the phosphors can be in any number of arrangements. When we talk about pixel art, we’re ultimately talking about the source, not the output.
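To make that concrete, here’s a rough sketch (Python with NumPy; the sample count and the crude blur are entirely illustrative assumptions) of what "sampling a raster row into a signal" boils down to: one row of pixel values becomes a brightness level over time along the scanline, and by the time it reaches the gun there are no pixel boundaries left to speak of.

```python
import numpy as np

def row_to_beam_levels(pixel_row: np.ndarray, samples_per_line: int = 1536) -> np.ndarray:
    """Illustrative only: treat one row of 8-bit pixel values as the brightness
    the electron gun should be driven to along one scanline.  The CRT never
    sees 'pixels' -- just this continuous-ish level over time."""
    x_pixels = np.linspace(0.0, 1.0, num=len(pixel_row), endpoint=False)
    x_beam = np.linspace(0.0, 1.0, num=samples_per_line, endpoint=False)
    levels = np.interp(x_beam, x_pixels, pixel_row / 255.0)   # crude resample
    # crude box blur, only to hint that the analog signal path has limited bandwidth
    kernel = np.ones(9) / 9.0
    return np.convolve(levels, kernel, mode="same")

row = np.zeros(256)
row[100:110] = 255                      # a bright 10-pixel run in the source
beam = row_to_beam_levels(row)          # comes out as a smooth bump, no hard edges
```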
When we talk about pixels, we’re talking about discrete squares of color that simply don’t exist on a CRT. In fact, when we talk about "square pixels" in terms of older systems, what we really mean is that something has done the work to keep the aspect ratio the same as that of the source raster image. CRT does not, by default, make "square pixels." CRTs stretch things vertically, and it’s up to either the artists to compensate directly in their art assets or the console itself to anticipate that vertical stretch and compensate by resizing its source image wider before passing it along. The NES is pretty notable for having done the latter to create square pixels by default. Many systems failed to do so, which is why you see things like the incredibly broad and beefy Street Fighter sprites – Capcom had fantastic artists who knew what they were dealing with and compensated for the stretch directly in their art.
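If you want to preview on a square-pixel screen how a given buffer’s proportions actually come out, the arithmetic is just a horizontal resample to the display’s ratio. A minimal nearest-neighbor sketch (Python with NumPy; the buffer sizes below are examples for illustration, not a claim about any specific console’s exact output):

```python
import numpy as np

def resample_to_display_ratio(frame: np.ndarray, display_ratio: float = 4 / 3) -> np.ndarray:
    """Nearest-neighbor preview of how a raster's proportions land on a
    fixed-ratio tube: resample the width so width/height matches the display.
    Purely illustrative -- no console or CRT does anything this literal."""
    h, w = frame.shape[:2]
    new_w = round(h * display_ratio)                  # e.g. 224 lines -> ~299 wide
    src_x = (np.arange(new_w) * (w / new_w)).astype(int)
    return frame[:, src_x]

narrow_buffer = np.zeros((224, 256, 3), dtype=np.uint8)   # 256-wide source: comes out wider
wide_buffer   = np.zeros((224, 384, 3), dtype=np.uint8)   # 384-wide source: comes out narrower
print(resample_to_display_ratio(narrow_buffer).shape,
      resample_to_display_ratio(wide_buffer).shape)
```

Same formula either way: whatever the buffer’s width, the tube shows its fixed ratio, so proportions drawn assuming square pixels only survive if someone accounted for that somewhere.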
What about scanlines?
To start, "scanlines" are misunderstood by many as lines of darkness in the image, when in fact the actual scanlines are the "scan" across the horizontal lines by the electron beam, i.e. the light part, at least from a hardware perspective. "Scanlines" as a visible artifact (the sense we’ll use from here on out) are a product of older hardware, and they’re an intentional abuse of how TVs worked: cutting out half the picture to get double the framerate. Only that’s not strictly true, because when both interlaced fields were drawn, the two passes of the beam were potentially hitting at least some of the same phosphors at different spots depending on the arrangement, and the way phosphors get excited means the absence of the second field didn’t necessarily leave actual full gaps so much as slightly dimmer lines or some kind of linear pattern. Which isn’t to say sets with full gaps didn’t exist; just that it’s not quite so simple. Some or all of this may have been compensated for by the choice of colors, all things depending, and in fact colors are brighter on a CRT than you’d expect. CRTs are actually pretty bright and offer some excellent color saturation, which doesn’t always come through well on LCD with its blue-based backlight, especially for warm colors. For example, working in the NES palette, you’ll find its two brown tones are less "dark chocolate and bronze" and more "crayon brown and caramel." But scanlines overall are a complicated matter for many reasons.
There ARE things LCD can do to approximate certain aspects, but they’re all filters that remove light and darken the image. Scanlines can be approximated by darkening alternate lines like window blinds, but actual scanlines are not the same hard, readily apparent thing, because the gaps between the lines are at a very sub-pixel level and the points of light generated by the phosphors may range anywhere from a visible gap to nearly touching. Accurately reproducing this requires you to choose an arrangement of fake phosphors (which vary drastically from model to model on a real CRT TV) and then essentially bloom them all individually, but in a way that doesn’t necessarily use the whole phosphor area so much as draw a circular gradient in it from some arbitrary line across it representing the level of that color’s beam on that line. Also, the phosphors are thin rectangles on any set someone generally cares about emulating. You can see already where this starts to get complicated even making a few assumptions. Thus scanlines cannot be accurately reproduced with a simple linear filter, as one often sees, because any such filter simply cannot account for the fact that a scanline is anything but a uniform thickness.
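To make the contrast concrete, here’s a sketch of both approaches in Python with NumPy: the uniform "window blinds" darkening, and a version that treats each source line as a vertical gaussian whose width follows brightness. Every constant in it is an assumption for illustration, not a measurement of any real tube or mask.

```python
import numpy as np

def blinds_scanlines(img: np.ndarray, strength: float = 0.5) -> np.ndarray:
    """The naive filter: uniformly darken every other row by a fixed amount."""
    out = img.astype(np.float32).copy()
    out[1::2] *= (1.0 - strength)
    return np.clip(out, 0, 255).astype(np.uint8)

def gaussian_beam_scanlines(frame: np.ndarray, out_rows_per_line: int = 6,
                            base_sigma: float = 1.1, spread: float = 0.8) -> np.ndarray:
    """Closer in spirit to the real thing: each source line becomes a vertical
    gaussian whose width grows with brightness, so bright lines swell toward
    each other and dark lines leave wider gaps.  All constants are assumptions."""
    h, w, c = frame.shape
    f = frame.astype(np.float32) / 255.0
    out = np.zeros((h * out_rows_per_line, w, c), dtype=np.float32)
    ys = np.arange(h * out_rows_per_line, dtype=np.float32)
    for line in range(h):
        center = (line + 0.5) * out_rows_per_line
        sigma = base_sigma + spread * f[line]            # beam fattens with brightness
        dist2 = (ys[:, None, None] - center) ** 2        # distance from this line's center
        out += f[line] * np.exp(-dist2 / (2.0 * sigma ** 2))
    return np.clip(out * 255.0, 0, 255).astype(np.uint8)
```

Note how the naive version just subtracts a fixed slice of light everywhere, while the beam model lets bright lines swell until the gap nearly closes, which is exactly the non-uniform thickness described above.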
Even on a CRT itself it’s hard to approximate without using the same hardware tricks, because drawing fake dark lines across the screen to black out half the picture doesn’t double up the beam on the lines that remain the way the real hardware trick does, so the whole image just ends up dimmer. There probably isn’t a single magic percentage of transparency you can throw on black lines to get it right. Then again, maybe I’m wrong and it’s as easy as a convenient 50% or 25%, but then you also have to consider that if you’re already using a CRT, you’re already on that original hardware. If you use RetroTVFX, then maybe you’re getting somewhere, and seriously, I cannot overstate how much of a treasure it is, but as of yet I haven’t really heard of it making its way into any notable releases. GET ON IT INDIE DEVS!
If you look at the rgb2x and rgb3x filters DOSBox offers as part of its video settings, they do in fact reproduce the color dots one would see on a CRT computer monitor, which has a very different setup to a TV due to the higher resolutions in play, but the filter likewise works by cutting out a lot of the light of an LCD. While the effect is incredibly authentic, even accurately reflecting the correct aspect ratio in the output, it ultimately results in a dingy image because the fidelity simply isn’t there on a 96 DPI screen to get the sub-pixel rendering of the glow of the individual dots, never mind that the individual dots simply will not add up the way they would from an emissive source rather than a filter. Again, OLED would work wonders for this because OLED is effectively already doing it, but an LCD can only darken the base white. That said, it’s still the best way to play DOS games for ultimate authenticity as things stand.
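For a feel of why that kind of mask dims the picture, here’s a rough, from-memory approximation of the idea in Python with NumPy. To be clear, this is not DOSBox’s actual scaler code, just the general shape of an "rgb3x-style" triad pattern:

```python
import numpy as np

def rgb_triad_3x(img: np.ndarray) -> np.ndarray:
    """Rough sketch of an rgb3x-style mask (NOT DOSBox's actual scaler): each
    source pixel becomes a 3x3 block whose columns pass only the red, green,
    and blue components respectively."""
    h, w, _ = img.shape
    big = np.repeat(np.repeat(img, 3, axis=0), 3, axis=1).astype(np.float32)
    mask = np.zeros((3, 3, 3), dtype=np.float32)
    mask[:, 0, 0] = 1.0   # left column of each block: red only
    mask[:, 1, 1] = 1.0   # middle column: green only
    mask[:, 2, 2] = 1.0   # right column: blue only
    big *= np.tile(mask, (h, w, 1))
    return big.astype(np.uint8)
```

Each column of every block keeps only one channel, which is exactly the "cutting out a lot of the light" described above: on a subtractive LCD that lost light is simply gone, whereas the real dots on a monitor were glowing sources adding up to the full color.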
What about color?
Even ignoring the darkening aspect of scanlines, getting the colors to blend is HARD. Now, I will say that RetroTVFX absolutely worked the wizardry to account for this amid a slew of formats each with their own blending concerns, but for your average non-wizard solution, you can’t simply color some pixels and call it a day, because of that full-screen bloom effect I mentioned on your individual phosphor gradients. And that’s before you account for the various imperfections of the compression or faulty math of a given signal, which will create artifacts. If you look up the NES sprite of the Sahagin Prince, you’ll probably identify it as having a yellow and green color scheme, but because of the way the signal influences itself, in actuality all of the yellow bits get very much greened out and the sprite takes on a much more organic overall coloration that looks smoothly shaded.

That, and CRT simply doesn’t display colors the same way as LCD. Really! You know why so many characters, like say Jon Talbain, are blue instead of gray? Because blue is just a more reliable color on CRT than gray is. Grays can easily get bluer, and at the same time desaturated blues can easily wash out, so sometimes just running with it was best because the worst you’d get is the grayer tones you may well have wanted in the first place. Anything that approaches achromatic also tends to pick up color from nearby depending on the video standard in use, and a lot of consoles used video standards that meant color bled like a stuck pig, especially red. The artists ultimately used this to their advantage, much like any artist manages to turn the imperfections of their tools into features, but there’s simply no getting around the fact that viewing raw pixel art on a modern screen doesn’t in any way show the end result, not even the actual colors, much less how they ended up blended with dithering to produce whole new colors and alpha. This and the CRT’s vibrancy mean that you can’t really understand how pixel art looked without viewing it on original hardware.
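If you want to see even one slice of that in action, here’s a crude sketch in Python with NumPy of just the chroma-bandwidth part: convert to YIQ, leave luma alone, and smear the color channels sideways. It is nowhere near a real composite/NTSC simulation (no carrier, no phase artifacts, none of the per-console quirks, and the blur width is an arbitrary assumption), but it shows the basic mechanism: color resolution is much lower than brightness resolution, so hues smear into their neighbors.

```python
import numpy as np

# Standard-ish RGB<->YIQ matrices; the inverse is derived rather than hand-typed.
RGB_TO_YIQ = np.array([[0.299,  0.587,  0.114],
                       [0.596, -0.274, -0.322],
                       [0.211, -0.523,  0.312]])
YIQ_TO_RGB = np.linalg.inv(RGB_TO_YIQ)

def chroma_bleed(img: np.ndarray, taps: int = 15) -> np.ndarray:
    """Crude sketch of ONE composite-video effect: luma keeps its detail while
    the chroma channels (I/Q) are low-passed horizontally, so saturated colors
    smear into their neighbors."""
    rgb = img.astype(np.float32) / 255.0
    yiq = rgb @ RGB_TO_YIQ.T
    kernel = np.ones(taps, dtype=np.float32) / taps
    for ch in (1, 2):                       # blur only the chroma channels, row by row
        for row in range(yiq.shape[0]):
            yiq[row, :, ch] = np.convolve(yiq[row, :, ch], kernel, mode="same")
    rgb_out = np.clip(yiq @ YIQ_TO_RGB.T, 0.0, 1.0)
    return (rgb_out * 255.0).astype(np.uint8)
```

Run it over any piece of raw pixel art and hard single-pixel color transitions turn into softer gradients, which is only one ingredient of what a composite-connected CRT would have layered on top.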
Side note: dithering never stopped being a thing, finding a place in handheld games like Pokémon where it was positively chunky, and it’s now in use with HD games because the pixels of modern living room sets are small enough that from normal viewing distance, it’s not noticeable, but can prove more efficient than alpha blending in things like fading effects or just wherever you can get away with it. The difference is everyone thinks that dithering was ALWAYS chunky when in fact its modern use in HD is much more akin to its original use on CRT. The point of dithering is not to make a visible checkerboard and never has been. CRT blended it in the hardware and HD blends it in the limited resolution of our eyes.
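For the curious, dithering itself is a tiny amount of code. A minimal ordered-dither sketch in Python with NumPy (the 2×2 Bayer matrix is the classic textbook one; nothing here is claimed to match any particular game’s implementation):

```python
import numpy as np

# 2x2 Bayer matrix, normalized so thresholds sit evenly inside (0, 1)
BAYER_2X2 = (np.array([[0, 2],
                       [3, 1]], dtype=np.float32) + 0.5) / 4.0

def ordered_dither(gray: np.ndarray) -> np.ndarray:
    """Minimal ordered dither: a grayscale image in 0..1 becomes pure black/white,
    but the local density of white pixels still tracks the original level --
    which is all the eye (or a CRT's blur) ever averages anyway."""
    h, w = gray.shape
    thresholds = np.tile(BAYER_2X2, (h // 2 + 1, w // 2 + 1))[:h, :w]
    return (gray > thresholds).astype(np.uint8)              # 0 or 1 per pixel

gradient = np.tile(np.linspace(0.0, 1.0, 256), (64, 1))      # left-to-right ramp
dithered = ordered_dither(gradient)
print(dithered.mean(axis=0)[::32])     # pixel density rises with brightness
```

Squint, step back, or let a CRT’s glow blur it, and the checkerboard disappears into the in-between shade it was standing in for, which is the whole point.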
What’s the solution?
The solution is ultimately in understanding how things looked and why and using modern technology to get as close as possible. As of right now? We really do have the technology. Sony’s handling of PS1 games on PS5 is incredibly promising because it shows they get it. There’s no one better equipped to understand the PS1 hardware than Sony themselves and the real question becomes who was involved in reproducing the look and how faithful they decided to be. We otherwise have emissive technology in the form of OLED and it’s becoming quite common specifically because of the brightness and color response it offers. Resolutions are high enough that an accurate bloom can easily take place if anyone cares to do it. 1080p may not be stellar at PC monitor distances, but it’s probably more than enough in the living room.
The biggest solution, though, is consumer demand. As I said at the start, for years everyone pointed to box art and manuals showing the chunky pixels as evidence that everything was chunky pixels, because those were what got easily documented. People generally didn’t take pictures of things on screens, and even if they had, the early Internet would have made it an absolute bear to get them online. Even when the memory of CRT was fresh, the idea that it was by its nature inferior didn’t do it any favors. For many years, authenticity was not valued, and later it was misunderstood. It’s only now that a few advocates are getting interest. If that interest becomes demand, as it seems to slowly be doing, then authenticity will become a selling point and encourage more creators to honor it.