Way back, a lot of computers had an option to output composite video so a TV could be used as a monitor. Over time this became uncommon, then it came back in the 90s and stuck around until HDMI became the standard.
So what is the deal? Well, VGA is an analog video signal, unlike CGA and EGA, which are digital. Old TVs are analog too, so I thought maybe I could build an adapter instead of dishing out some cash for a proper adapter box. Sure enough, there are plenty of circuits floating around for doing this. The simplest circuit I found uses a single 680 ohm resistor on the red gun signal of the VGA connector, producing a monochrome (B&W) picture. Fine for me, since I wanted to use it as a command line terminal.

I forced my test machine to use the lowest possible resolution and tried it out. It somewhat works, and I had to swap the fixed resistor for a 1K pot to get a better picture. However, I learned real quick that NTSC uses a different scan rate than VGA: NTSC's horizontal rate is around 15.7 kHz, while VGA runs at about 31.5 kHz for 640×400. The picture was mirrored into three copies, the left and right ones were cut off, and the center one was crushed vertically.

After digging around I found a super old website where some guy tossed a B&W CRT into his desktop computer. He modified the video card's ROM to use a slower clock. Later on I found a webpage that covered forcing a video card to a slower scan rate in software under Linux and Xorg, but it only listed some old ATI and Matrox cards that I don't have anymore, and I doubt those cards even work in modern distros of Linux, since over the years thousands of older devices have been cut out of the kernel.
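The scan-rate mismatch is easy to see with a little arithmetic. As a rough sketch (assuming the standard NTSC timing of 525 lines at ~29.97 frames per second, and the standard VGA 640×400 timing of a 25.175 MHz pixel clock with 800 total pixels per scanline), the horizontal frequencies work out to roughly a 2:1 ratio, which is why the TV painted two VGA lines where it expected one and showed a mangled picture:

```python
# Compare horizontal scan rates of NTSC vs. 640x400 VGA (standard timings assumed).
# NTSC: 525 lines per frame at 30000/1001 (~29.97) frames per second.
ntsc_hfreq = 525 * (30000 / 1001)   # line rate in Hz, about 15.7 kHz

# VGA 640x400@70: 25.175 MHz pixel clock, 800 total pixels per line
# (visible pixels plus front porch, sync pulse, and back porch).
vga_hfreq = 25_175_000 / 800        # line rate in Hz, about 31.5 kHz

print(f"NTSC horizontal: {ntsc_hfreq / 1000:.2f} kHz")
print(f"VGA horizontal:  {vga_hfreq / 1000:.2f} kHz")
print(f"ratio: {vga_hfreq / ntsc_hfreq:.2f}")  # VGA scans ~2x faster
```

So unless the card can be talked into a ~15 kHz mode (which is exactly what those ROM hacks and Xorg modeline tricks were doing), a bare resistor adapter can't fix the timing.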
I kinda knew I wasn't going to get far, but it was fun to try. Looks like I'll have to either buy a converter box or maybe reprogram my MaxiMite clone to act as a serial terminal.