Eurorack RGB controller for composite signal path

Hey Everyone,

I’ve been lurking on the LZX forums for a little over a year while gradually dipping my toes into video synthesis/glitch. I also started getting into PCB design for my job over the past year or so, and in my free time I’ve been looking into designing some simple eurorack video modules based around a composite video signal path. Long story short, I’ve successfully made a couple of passive utility boards and am now nearing the end of a project where I cram Syntonie’s CV001, an upgraded Archer Video Enhancer circuit, into 10HP (shoutout to Syntonie for making their stuff open source). Well . . . by “nearing the end” I mean ordering the boards and waiting until they show up to find out where I messed up (if they work I’ll post the details).

I’m scoping out the start of a new project to occupy my time while the boards are being made, and I really want a way to control the RGB colors of a composite signal. The current idea is to use a composite-to-RGB conversion circuit, run R, G, and B into 1K potentiometers with horizontal and vertical sync untouched, then convert everything back into composite for the output. I’m looking to use the Composite to RGB circuit found in the second post here and the RGB to Composite circuit found in the “Building Your Own RGB To Composite Converter” section here.

I’m looking for any advice here that y’all are willing to give in regards to my choice of circuits, how to make things simpler/cheaper, and how best to make everything open source. I’ve been tinkering with electronics for a while now but mostly for my career and not for the open source DIY world.

I really enjoy the back and forth here on the LZX forums (especially everything surrounding Gen3), I appreciate the technical know-how and creative touches I’ve seen on here. I hope to add some info and flair myself as my projects develop.



Chroma decoding is a tall order, and it will be your biggest challenge in a circuit like this. There aren’t any off-the-shelf ICs for decoding like there are for encoding (like the AD724), and it’s a complex, multi-part process involving regeneration of a genlocked subcarrier, demodulation of that subcarrier into UV components, and then YUV to RGB conversion. These days this is usually all done in embedded systems with ICs like the TVP5150AM1 or ADV7181C (this is what we use to decode composite in the TBC2 module and the Chromagnon frontend.)
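For reference, the last step of that chain (YUV to RGB) is just a small matrix multiply. Here's a minimal Python sketch using the common BT.601-style coefficients, with everything normalized to a 0..1 range:

```python
# YUV -> RGB conversion, the final step of composite decoding.
# Uses the common BT.601 coefficients. R/G/B and Y are normalized
# 0..1; U and V are centered on 0.

def yuv_to_rgb(y, u, v):
    r = y + 1.13983 * v
    g = y - 0.39465 * u - 0.58060 * v
    b = y + 2.03211 * u
    # clamp to the valid range, as a hardware decoder's output stage would
    return tuple(min(1.0, max(0.0, c)) for c in (r, g, b))

# A zero-chroma input is pure grey: yuv_to_rgb(0.5, 0.0, 0.0) -> (0.5, 0.5, 0.5)
```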

All-analog CVBS decoding is not impossible, of course! It’s just not the easiest place to dive into CVBS circuits (on the contrary, probably the hardest.)

Maybe start with trying to separate Luma/Chroma so you can manipulate them separately, then recombine them. Once you have that kind of infrastructure, there’s a lot of CVBS-specific trickery you could get going. And it’s a good step towards a color decoder design.


Separating luma and chroma is a great idea!
Naturally I’m looking to start small and build up. My inspiration for the RGB control came from LoFi Future’s GBS-8100 circuit bend, which I was able to do earlier this year, but I’ve burned out four of the five boards I’ve tried this bend on (I’ve always had very shaky hands). I was hoping the MC1377 IC would be sufficient, but if needed I may be able to get the TVP5150AM1 working.


The MC1377 or AD724 or similar would work for converting RGB to Composite as long as it is already blanked and black/white level clipped, and you have separate sync signals to supply. (Check out Cadet II schematics as an example implementation, with blanking and clipping, and the Cadet I schematic as blanking generator.)

But it is an encoder IC, not a decoder IC, so won’t help you go from Composite back to RGB. That’s the hard part.

Implementing TVP5150AM1 will require integration with a BT.656 stream receiver of some kind – such as an FPGA – and then some kind of encoder IC (like ADV7393 or similar) to turn that back into analogue. That’s quite a project, involving high pin count TQFP parts. But some of those encoders/decoders are meant to be back-to-back as a stream. (Still likely will need some sort of MCU to configure the parts over I2C bus.)
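To make the "MCU configures the parts over I2C" step concrete, here's a tiny Python sketch. The I2C address and the register/value pairs are placeholders, not real TVP5150AM1 values (check the datasheet for the actual register map); it only shows the shape of an init sequence and the raw bytes each write clocks out on the bus:

```python
# Hypothetical sketch of the MCU-side configuration step: decoder
# chips like the TVP5150AM1 are set up by writing registers over I2C.
# The address and register/value pairs below are PLACEHOLDERS, not
# real TVP5150 values -- consult the datasheet for the actual map.

DECODER_I2C_ADDR = 0x5D  # placeholder 7-bit address

INIT_SEQUENCE = [
    # (register, value) -- placeholder entries for illustration only
    (0x00, 0x00),   # e.g. select the composite input
    (0x03, 0x01),   # e.g. enable the output pins
]

def to_i2c_frames(addr, sequence):
    """Render each register write as the raw bytes an MCU would clock out."""
    # Write transaction: 7-bit address shifted left + W bit (0),
    # then the register number, then the value.
    return [bytes([(addr << 1) | 0, reg, val]) for reg, val in sequence]

frames = to_i2c_frames(DECODER_I2C_ADDR, INIT_SEQUENCE)
```

On real hardware the `frames` step would be replaced by your MCU's I2C peripheral, but the addressing arithmetic is the same.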

Point being, decoding and then encoding is a huge amount of costly infrastructure for one device or module – but modifying the color channels of the composite signal? That’s possible a lot more directly, by working on the luma/chroma of the CVBS signal itself (phase shift/amplitude for chroma, and contrast/brightness for luma.)
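As a toy numerical model of that direct approach (not a real circuit): treat a scanline as luma plus a chroma subcarrier, where the subcarrier's phase sets hue, its amplitude sets saturation, and luma gain/offset give contrast/brightness. The frequencies and amplitudes below are NTSC-ish and purely illustrative:

```python
import numpy as np

# Toy model: a scanline is luma plus a color subcarrier. Shifting the
# subcarrier's phase changes hue, scaling its amplitude changes
# saturation, and luma gain/offset give contrast and brightness.
# Values are illustrative, not calibrated to any video standard.

FSC = 3.579545e6          # NTSC color subcarrier, Hz
FS = 4 * FSC              # sample rate for the sketch

def process(luma, t, sat=1.0, hue_deg=0.0, contrast=1.0, brightness=0.0,
            chroma_amp=0.2):
    """Rebuild a line with adjusted luma/chroma parameters."""
    chroma = sat * chroma_amp * np.sin(2 * np.pi * FSC * t
                                       + np.deg2rad(hue_deg))
    return contrast * luma + brightness + chroma

t = np.arange(1024) / FS
luma = 0.4 + 0.2 * np.sin(2 * np.pi * 15.734e3 * t)   # slow luma content
line = process(luma, t, sat=0.5, hue_deg=90.0)         # half saturation, hue shift
```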


Ahhh I see, I was misinterpreting the CPC Composite Video Converter schematic I found. I thought it was a decoder IC; my brain must be going fuzzy from too much time spent in the lab!

Thanks so much for the tips, I’m glad I reached out before I got started on this!

I’ll take your advice and make a module with some potentiometers tied to Luma/Chroma and see where that takes me.


After sleeping on it, I realized there are other tricks beyond encoding to think about too!

A technique Dave Jones used with his colorizer was IIRC to generate Red, Green, and Blue static colors as source images (their own composite video signals.) By mixing those with another image, you have another way to modify RGB – but without having to resort to any CVBS encoding/decoding.


I did some research over the weekend and designed RevA of a 4HP solution for exposing chroma and luma.
I think this should work; going to order some parts and boards this week after cleaning it up a bit. I’m a little peeved I couldn’t find a through-hole replacement for the LT6206 op amp (the pitch on those pins looks tiny!).

The general idea is:
1. The input goes to a splitter.
2. One of the two splitter outputs goes through a low-pass filter to get luma; the other goes through a high-pass to get chroma.
3. Luma and chroma go to distinct potentiometers for attenuation control.
4. The attenuated luma and chroma get recombined, then sent to the output.
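The steps above can be sketched numerically. Here's a rough Python/SciPy model of the signal path (not the analog circuit itself; the sample rate and split frequency are illustrative):

```python
import numpy as np
from scipy.signal import butter, lfilter

# Sketch of the module's signal path: split CVBS into luma (low-pass)
# and chroma (high-pass) around the NTSC subcarrier, attenuate each
# (the two pots), then sum them back together. Cutoff is illustrative.

fs = 14.318e6            # sample rate, ~4x the NTSC subcarrier
fsc = 3.579545e6         # NTSC color subcarrier
fcut = 2.5e6             # split frequency between luma and chroma

b_lo, a_lo = butter(2, fcut / (fs / 2), btype='low')
b_hi, a_hi = butter(2, fcut / (fs / 2), btype='high')

def split_and_recombine(cvbs, luma_gain=1.0, chroma_gain=1.0):
    luma = lfilter(b_lo, a_lo, cvbs)      # low-pass branch -> luma
    chroma = lfilter(b_hi, a_hi, cvbs)    # high-pass branch -> chroma
    return luma_gain * luma + chroma_gain * chroma  # recombine

# Example: a line with some luma content plus subcarrier
t = np.arange(2048) / fs
cvbs = 0.5 + 0.3 * np.sin(2 * np.pi * 100e3 * t) \
       + 0.2 * np.sin(2 * np.pi * fsc * t)
out = split_and_recombine(cvbs, luma_gain=1.0, chroma_gain=0.5)
```

Plotting the FFT of `out` for different gain settings is a quick way to sanity-check the split before committing to a board.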

If this works then I’ll look into other ways to modulate luma/chroma for more interesting effects.


Looks like a good start! Have you simulated the circuit using SPICE or built it up on breadboard/protoboard? This circuit should be easy to simulate using something like LT SPICE.

You will probably need to add buffering on the outputs of your filters – both after the filters themselves and after the attenuation pots. You’ll also need a 75 Ohm video output buffer before your final output, driven at double normal gain. Right now all of these parts are going to interfere a lot with your filters since they are part of the same unbuffered circuit. Looking at the AC analysis of the filter and the transient analysis of the output should show you what’s going on – if you’ve never done simulation before, it’s pretty accessible and incredibly useful!
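On the "double normal gain" point, the arithmetic is just the back-termination voltage divider:

```python
# Why the 75 Ohm output driver needs 2x gain: the series
# (back-termination) resistor and the 75 Ohm load at the far end
# form a voltage divider that halves the signal.

r_series = 75.0   # back-termination resistor at the driver output
r_load = 75.0     # terminated input at the receiving device

divider = r_load / (r_series + r_load)   # 0.5: half the amplitude reaches the load

v_wanted_at_load = 1.0                    # volts we want at the far end
v_driver = v_wanted_at_load / divider     # 2.0: so the driver stage runs at 2x gain
```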

You also need to change to a notch / band-pass topology rather than a low-pass / high-pass one if you want the full signal bandwidth, but the way you’re doing it could work for a lo-fi approach. You will just be missing some sharpness that may otherwise be present in the luma edges.
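A rough Python/SciPy sketch of that notch / band-pass split: a notch at the subcarrier for luma (full video bandwidth minus the chroma band) and a band-pass around the subcarrier for chroma. Frequencies are NTSC; the Q and bandwidth values are illustrative, not tuned:

```python
import numpy as np
from scipy.signal import iirnotch, butter, lfilter

# "Clean" topology: luma keeps the full bandwidth except a notch at
# the subcarrier; chroma is a band-pass around the subcarrier.
# Q and bandwidth below are illustrative values.

fs = 14.318e6            # sample rate, ~4x the NTSC subcarrier
fsc = 3.579545e6         # NTSC color subcarrier

b_n, a_n = iirnotch(fsc, Q=2.0, fs=fs)            # luma: notch out the chroma band
b_bp, a_bp = butter(2, [fsc - 0.6e6, fsc + 0.6e6],
                    btype='bandpass', fs=fs)      # chroma: band around fsc

def split_notch_bandpass(cvbs):
    luma = lfilter(b_n, a_n, cvbs)
    chroma = lfilter(b_bp, a_bp, cvbs)
    return luma, chroma
```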


Thanks for the tips!

I’m going to order the parts and try to breadboard it. I haven’t tried using SPICE before, so I’ll download it and poke around.

Still pretty new to all this, by “add buffering on the outputs of your filters” do you mean a current limiting resistor or do you mean implementing something like R5 after the filters as well as before the output?
Also, can you explain how to drive the output at double normal gain? Am I just adding in an amplifier?

Swapping out the high-pass/low-pass for a notch/band-pass is a good idea, but I may keep both and put in a switch to give it a “Clean” mode and a “Lo-Fi” mode. I think I’ll need to boost this to 6HP to include the buffers anyway . . .


If you can figure out how to draft a schematic/board layout, you will figure out SPICE fast! Think of it like having a virtual platform to test that your circuits work before you have to build them. LT SPICE is free and the most popular. We use TINA a lot here. I started with 5SPICE (free version.) Just like any layout software, they’re all a pain in the butt in different ways. Just dive in somewhere and simulate something very simple to start, like sending a triangle wave into a potentiometer and view the output.

Still pretty new to all this, by “add buffering on the outputs of your filters” do you mean a current limiting resistor or do you mean implementing something like R5 after the filters as well as before the output?

I mean a unity gain opamp buffer, like this:

Also, can you explain how to drive the output at double normal gain? Am I just adding in an amplifier?

Here’s a rough approximation, it’s easier to show a block diagram:

Thanks for the tips!

I hope I can help! LZX exists because a lot of smart folks spent a lot of time mentoring me, it’s my duty to pass that on.

Note: What I’m proposing above wouldn’t be entirely accurate for luma gain – and there’s no consideration for sync or DC restoration. This is actually a very big circuit! It’s part of the reason for the 1V RGB signal format – if you try to handle CVBS signals on a per-module basis, the circuits get giant pretty fast just for all the infrastructure that needs to be in place. Designing stuff for DC-coupled 1V RGB is a bit less annoying in that way. But you should absolutely keep pursuing what you’re working on and learn as much as you can. The most important bits for your luma/chroma splitter are going to be the cutoff frequencies on the filters and making sure you can pass the full gain from input to output drivers.


Happy to see the CBV001 schematic got you into making your own stuff, even though it is a really simple circuit and uses some crude circuit bending to achieve its effects, so it’s probably not the best example of a composite video processor. As Lars pointed out, proper composite processing involves a great amount of support circuitry (DC restore, Y/C separation, blanking, clipping, etc…), so circuit bending is kind of a shortcut to get something happening to the signal with a reduced number of components. In that sense, the Cadet schematics show most of the required circuitry and have been really helpful in my understanding of this subject.

For Y/C separation, it looks like the most straightforward way is to use filters as you did. Here is a simple notch/hi-pass configuration from Elektor UK, with cutoff frequencies set for PAL. It uses a single supply and transistor buffers, so the signal gets AC-coupled at the output, but the filters could be put in between op-amps as you did/Lars showed to avoid AC coupling. The op-amp stages would also allow you to adjust the gain of the signals, which get a bit attenuated by the filters.

(Complete article available here videomagazines/CVBS to SVHS - September 2001.pdf at main · Syntonie/videomagazines · GitHub)
I tested it and it works decently, but I wasn’t really satisfied with the fact that there was still chroma left in the luma; some devices were still picking up color, so the burst would probably have to be removed too.

This sent me down the digital decoder/encoder rabbit hole, as that gives a clean Y and C with the ability to choose between different digital filters, but it requires configuration over I2C using a microcontroller, plus some FPGA processing to keep the output in sync with the input, ending in a fairly complex implementation.

Then again, if the goal is to recombine Y and C later in the circuit/system, imperfect filtering is less of an issue. Here is a test I did with Y going to a single comparator (converting it to a 1-bit b&w signal) and adding C afterwards, with some modulation of the comparator threshold from 2x Castle VCOs. (Sorry for the crappy picture extracted from an old IG story.)
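A quick numerical sketch of that comparator trick (the threshold and output levels are hypothetical values, just to show the idea):

```python
import numpy as np

# Comparator trick: threshold luma into a 1-bit black/white signal,
# then add the chroma back on top. Modulating `threshold` (e.g. from
# a VCO) animates the effect. Levels below are illustrative.

def one_bit_colorize(luma, chroma, threshold=0.5, white=0.7, black=0.0):
    hard = np.where(luma > threshold, white, black)  # 1-bit luma
    return hard + chroma                              # re-add the color

# A luma ramp collapses to two flat levels around the threshold:
luma = np.linspace(0.0, 1.0, 8)
out = one_bit_colorize(luma, np.zeros(8))
```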

Old composite video processors are a good resource to look at for these kinds of filters; here is the Vidicraft Proc Amp (NTSC) with the filter labelled “Chroma Trap”. It’s pretty interesting as it is all discrete, from sync extraction to the actual video processing.
There was a PAL adaptation of this circuit published in Progetto Elektor (Italy) videomagazines/ProgettoElektor_1987_07-08 Color Processor.pdf at main · Syntonie/videomagazines · GitHub, so the filter cutoff is set for the PAL subcarrier.

Anyway, awesome to see you’re working on composite video processing. I slowly shifted to the modular/LZX 1V RGB format as it is a bit more convenient to work with, but I still have a few ideas for composite processing, and this kind of thread is motivating to get back to it :smiley: