This page documents our adventures with simulated NTSC/CRT effects, typically using the NTSC-CRT library by LMP88959. We have condensed this library to a single header and inserted it into the rendering paths of various programs to great and amusing effect.
This was a relatively quick single-commit job, which breaks down roughly as follows:
First, we included the ntsc.h single-file header mentioned above and shimmed an additional framebuffer (ntsc_buffer), as well as the NTSC_SETTINGS and CRT structs, into the start of doomgeneric_xlib.c.
#ifdef DG_NTSC
#define NTSC_C /* have the single-file header emit its implementation here */
#include "ntsc.h"

static uint32_t* ntsc_buffer = NULL; /* output framebuffer for the CRT effect */
static struct NTSC_SETTINGS ntsc;    /* describes the incoming framebuffer */
static struct CRT crt;               /* the simulated CRT and its output */
static int ntsc_field = 0;           /* even/odd field toggle */
#endif /* DG_NTSC */
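For those unfamiliar with the single-file header pattern, NTSC_C presumably works the way such implementation macros usually do: exactly one translation unit defines it before the include, so the function bodies are compiled there, while every other file gets only the declarations. A sketch of the pattern, using the names from the listing above:

/* doomgeneric_xlib.c: the one file that compiles the implementation. */
#define NTSC_C
#include "ntsc.h"

/* any_other_file.c: declarations only, no second copy of the code. */
#include "ntsc.h"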
Next, in the DG_Init() function, we allocated the new framebuffer (ntsc_buffer) to hold a 32-bit word for every pixel on the screen. We then set up the CRT struct to point to that framebuffer for its output and turned on blending and scanlines, and we set up the NTSC_SETTINGS struct with the sizing and pixel format information for the incoming framebuffer (DG_ScreenBuffer), to which we also added a pointer.
This is a pretty simple and intuitive pattern... The NTSC_SETTINGS struct describes the incoming framebuffer, which the library encodes into a simulated composite signal and then decodes into the CRT struct's output framebuffer for display!
#ifdef DG_NTSC
ntsc_buffer = malloc( DOOMGENERIC_RESX * DOOMGENERIC_RESY * 4 );

/* Initialize CRT buffer. */
crt_init(
   &crt, DOOMGENERIC_RESX, DOOMGENERIC_RESY,
   CRT_PIX_FORMAT_RGBA, (unsigned char*)ntsc_buffer );
crt.blend = 1;
crt.scanlines = 1;

ntsc.data = (unsigned char*)DG_ScreenBuffer;
ntsc.format = CRT_PIX_FORMAT_RGBA;
ntsc.w = DOOMGENERIC_RESX;
ntsc.h = DOOMGENERIC_RESY;
ntsc.as_color = 1;
ntsc.raw = 1;
#endif /* DG_NTSC */
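One corner we cut: the new framebuffer is never freed. A matching teardown might look like the following, though where to hook it is an open question, since doomgeneric does not provide a standard shutdown callback; the placement here is hypothetical:

#ifdef DG_NTSC
/* Hypothetical cleanup, e.g. registered with atexit(): */
free( ntsc_buffer );
ntsc_buffer = NULL;
#endif /* DG_NTSC */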
We also swapped the new framebuffer (ntsc_buffer) into the XCreateImage() call, so that we would be drawing the output of the CRT struct, and not the (now intermediate-stage) DG_ScreenBuffer framebuffer. The original call:

s_Image = XCreateImage(s_Display, DefaultVisual(s_Display, s_Screen), depth, ZPixmap, 0, (char *)DG_ScreenBuffer, DOOMGENERIC_RESX, DOOMGENERIC_RESY, 32, 0);

...became:

s_Image = XCreateImage(s_Display, DefaultVisual(s_Display, s_Screen), depth, ZPixmap, 0,
#ifdef DG_NTSC
   (char *)ntsc_buffer,
#else
   (char *)DG_ScreenBuffer,
#endif /* DG_NTSC */
   DOOMGENERIC_RESX, DOOMGENERIC_RESY, 32, 0);
Finally, in the DG_DrawFrame() function, we call crt_modulate() and crt_demodulate() to apply the transformation on every frame. We also flip the field bit between frames (and the frame bit every other frame), as this seems to be necessary, presumably to emulate the alternating fields of interlaced NTSC video. Worth noting, as well, is that we hard-coded the noise value to 52 (which is rather high, but matches our preference). In a future refinement, this should be more easily configurable; see the sketch after the listing below!
#ifdef DG_NTSC
ntsc.field = ntsc_field & 1;
if( 0 == ntsc.field ) {
   ntsc.frame ^= 1;
}
crt_modulate( &crt, &ntsc );
crt_demodulate( &crt, 52 /* noise */ );
ntsc_field ^= 1;
#endif /* DG_NTSC */
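As a gesture toward that refinement, here is a minimal sketch of one way the noise level could be made configurable; the DG_NTSC_NOISE environment variable name and the helper are our own invention, not part of doomgeneric or NTSC-CRT:

#include <stdlib.h> /* getenv(), atoi() */

/* Hypothetical helper: read the noise level from the environment,
 * falling back to our preferred default of 52. */
static int ntsc_noise_level( void ) {
   const char* env = getenv( "DG_NTSC_NOISE" );
   return (NULL != env) ? atoi( env ) : 52;
}

DG_DrawFrame() could then call crt_demodulate( &crt, ntsc_noise_level() ) in place of the hard-coded value above.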
This is enough to give us the effect demonstrated in the video above, albeit in a rather quick and dirty fashion. One flaw that is immediately apparent is the strange discrepancy that arises from the high output resolution. A real CRT using NTSC with that type of noise would not have a resolution of 640x480! We might investigate turning down the output resolution of the CRT struct, providing a more realistic scaled image.
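For instance, here is an untested sketch of that idea, rendering the CRT at half resolution into a smaller buffer and then doing a crude nearest-neighbor upscale into ntsc_buffer; the crt_lores buffer and the placement of the loop are our own assumptions, not code from the commit:

#define CRT_LORES_W (DOOMGENERIC_RESX / 2)
#define CRT_LORES_H (DOOMGENERIC_RESY / 2)

static uint32_t* crt_lores = NULL; /* hypothetical low-res CRT output */

/* In DG_Init(), the CRT would be pointed at the smaller buffer.
 * ntsc.w/ntsc.h stay at the full input resolution; only the output
 * shrinks. */
crt_lores = malloc( CRT_LORES_W * CRT_LORES_H * 4 );
crt_init( &crt, CRT_LORES_W, CRT_LORES_H,
   CRT_PIX_FORMAT_RGBA, (unsigned char*)crt_lores );

/* In DG_DrawFrame(), after crt_demodulate(), a nearest-neighbor upscale
 * would fill ntsc_buffer, which XCreateImage() still points at: */
for( int y = 0 ; DOOMGENERIC_RESY > y ; y++ ) {
   for( int x = 0 ; DOOMGENERIC_RESX > x ; x++ ) {
      ntsc_buffer[(y * DOOMGENERIC_RESX) + x] =
         crt_lores[((y / 2) * CRT_LORES_W) + (x / 2)];
   }
}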
This section is under construction!
Please stay tuned for updates pending research and development.
This video is much clearer than the previous examples, as the noise was turned down significantly. This was because, during our testing, noise over a certain level would start to cause vertical roll. This only seemed to be an issue on Windows, and we have not yet tracked down the cause.
The primary difficulties in the Windows implementation of the maug VDP had to do with figuring out how GetDIBits() and SetDIBits() worked... Long story short, one needs to fill out *all* the fields in the bitmap header passed to those functions, particularly the biPlanes field. maug now does this, so the VDP subsystem in maug now works on Windows, nominally.
Also note that the Windows version uses the CRT_PIX_FORMAT_BGRA format, as that is the byte order in Windows bitmaps.
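To illustrate, here is a minimal sketch of the kind of fully filled-out header that makes GetDIBits() cooperate; the helper name and its parameters are our own for illustration, not maug's actual code:

#include <windows.h>

/* Hypothetical helper: copy the pixels of hBmp into a w * h * 4 byte
 * buffer as top-down 32-bit BGRA. Every header field is filled out;
 * in particular, biPlanes must be set to 1 or the call can fail. */
static BOOL get_bitmap_pixels(
   HDC hdc, HBITMAP hBmp, int w, int h, void* pixels
) {
   BITMAPINFO bmi;

   ZeroMemory( &bmi, sizeof( BITMAPINFO ) );
   bmi.bmiHeader.biSize = sizeof( BITMAPINFOHEADER );
   bmi.bmiHeader.biWidth = w;
   bmi.bmiHeader.biHeight = -h;   /* negative height = top-down rows */
   bmi.bmiHeader.biPlanes = 1;    /* the easily-missed field! */
   bmi.bmiHeader.biBitCount = 32; /* BGRA, hence CRT_PIX_FORMAT_BGRA */
   bmi.bmiHeader.biCompression = BI_RGB;

   return 0 != GetDIBits( hdc, hBmp, 0, h, pixels, &bmi, DIB_RGB_COLORS );
}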