How the HL1606 works

I've been asked by The Travelling Light Circus to make some radio-synchronized LED strips for them to use at the Big Chill Festival. I'm using 5V multicolour strips that I'm going to control with an Arduino. The strips are driven by the fairly commonly used HL1606 chips. However, although these chips are commonly used, there's not much information about how they work. There's a project on Google Code that contains an Arduino library for driving the strips; that library was used to implement a flashing headband. However, the library drives the strips using simple software delay loops and can only make the patterns ripple in one direction, from the end of the strip the controller is attached to. I need to drive at least four strips simultaneously, plus a radio, so anything that relies on delay loops is a non-starter. I also need patterns that go in any direction, including outwards from the centre of the strips.

The headband project contains the only English HL1606 datasheet I've been able to locate. The datasheet looks like it has been translated from Chinese and is pretty unclear in places, but it is enough to get going. One thing that immediately caught my attention was that it says the HL1606 is "SPI controlled". SPI is a serial bus protocol that is supported in hardware by the Atmel AVR CPU used on the Arduino boards, so it appeared I might be able to offload much of the work of driving the strips to the hardware support. Unfortunately, that claim is only partly correct, as I'll explain in a subsequent post.

What follows below is a combination of information from the datasheet with clarifications and corrections, because in some areas of detail the datasheet is simply wrong.

The HL1606 has two halves: two shift buffers (A & B) and two output drivers (A & B). Data on the input pin is shifted into the shift buffers and then copied, under the control of the /SS line, into the output drivers to drive the two LEDs connected to the chip. In typical use, such as an LED strip, the HL1606s are daisy-chained together, so the output of one chip drives the input of the next. As data is shifted into each chip, the contents of the A buffer move into the B buffer, and the contents of the B buffer are passed on to the next chip in the chain. Therefore, to set a 20-LED, 10-chip chain we'd shift 20 bytes into the end of the chain and they'd propagate down it. The pins we'll have to drive are as follows:
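The daisy-chained A/B buffer behaviour described above can be sketched as a toy software model (this is a simulation for checking the logic, not real hardware code):

```cpp
#include <cstdint>
#include <vector>

// Toy model of the HL1606 shift chain: each chip holds two bytes, A and B.
// A byte pushed into the chain enters chip 0's A buffer, A's old contents
// move to B, and B's old contents spill into the next chip's A, and so on.
struct Chip { uint8_t a = 0, b = 0; };

void shiftByteIn(std::vector<Chip>& chain, uint8_t data) {
    uint8_t carry = data;
    for (Chip& chip : chain) {
        uint8_t out = chip.b;   // byte leaving this chip for the next one
        chip.b = chip.a;
        chip.a = carry;
        carry = out;
    }
}
```

After shifting in 2N bytes, an N-chip chain is full, with the first byte sent sitting in the far chip's B buffer - which is why a 10-chip, 20-LED strip needs 20 bytes per update.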

Software-accessible pins

Datasheet pin name    SPI signal name    Purpose
D-I                   MOSI               Data input
CK-I                  SCLK               Data clock
L-I                   /SS                Slave select (active low)
S-I                   (none)             Fade clock

I'll cover the details of how to use the AVR's hardware SPI support in a later post; while I was investigating how the HL1606 worked I just used simple digitalWrite() calls to drive the bus. The first three pins are the standard SPI ones - MISO is missing from the table because the HL1606 only consumes data and doesn't produce any. The fade clock is the pin used to control the speed of the LED fades; more on that below.

LED control byte format

Bits    Purpose
0-1     Blue LED control bits
2-3     Red LED control bits
4-5     Green LED control bits
6       Fade rate bit, 0 = slow (127 steps), 1 = fast (63 steps)
7       Buffer latch bit, 1 = latch, 0 = don't latch

Each HL1606 drives two LEDs, each of which requires one control byte. The particular strips I'm using have the LEDs wired up in (blue, red, green) order, but other strips may have a different order; you'll have to experiment to find out.
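A helper that packs a control byte from the layout above might look like the sketch below. The 2-bit colour codes follow the (bit 0, bit 1) table later in the post; how those two bits map onto the byte's bit positions is my reading of the datasheet, so treat the code values as an assumption to verify against real hardware:

```cpp
#include <cstdint>

// 2-bit per-colour codes, per the control-bit table later in the post.
// Assumption: within each field, "bit 1" is the higher-order bit.
enum ColourCode : uint8_t {
    LED_OFF       = 0b00,  // bit0=0, bit1=0
    LED_ON        = 0b10,  // bit0=0, bit1=1
    LED_FADE_UP   = 0b01,  // bit0=1, bit1=0
    LED_FADE_DOWN = 0b11,  // bit0=1, bit1=1
};

// Pack one control byte: blue in bits 0-1, red in 2-3, green in 4-5,
// fade rate in bit 6, buffer latch in bit 7 (normally set).
uint8_t controlByte(ColourCode blue, ColourCode red, ColourCode green,
                    bool fastFade, bool latch = true) {
    return (uint8_t)(blue | (red << 2) | (green << 4) |
                     (fastFade ? 0x40 : 0) | (latch ? 0x80 : 0));
}
```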

To drive data to the strip you need to do the following (I'm using the SPI pin names rather than the ones in the datasheet):

  1. Set /SS low to select the chip.
  2. For each control byte to be sent, send the bits in MSB-first order as follows
    1. Transfer a bit to the chip by setting the MOSI pin high for 1, low for 0
    2. Send a clock pulse (high/low) on SCLK
  3. Set /SS high to deselect the chip.

It is important to pull /SS low at the start of the transfer and only pull it high after transferring all the bytes. For example, if you want to set just the fifth LED to red you'd pull /SS low, send a 'red' byte followed by four 'off' bytes, then pull /SS high. This is necessary because the state of /SS governs what is done with the data shifted into the chip. If /SS is active the chip transfers the data along the shift buffer chain but doesn't copy it into the output drivers, so nothing is visible as the data is being shifted down the chain. When /SS is pulled high the current data is transferred from the A & B shift buffers into the corresponding output drivers and the LEDs are lit appropriately. The library on Google Code doesn't do this; it drives /SS low/high for each byte that's transferred, which causes an undesirable flickering effect.
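The steps and framing rule above can be sketched off-hardware like this - the pin numbers and setPin() recorder are stand-ins so the logic can be checked without a strip; on an Arduino, setPin() would simply be digitalWrite():

```cpp
#include <cstdint>
#include <utility>
#include <vector>

const int SS_PIN = 10, MOSI_PIN = 11, SCLK_PIN = 13;  // illustrative pins
std::vector<std::pair<int, int>> events;              // (pin, level) history

void setPin(int pin, int level) { events.push_back({pin, level}); }

// /SS goes low once, all the control bytes are clocked out MSB-first,
// then /SS goes high once to latch the whole frame to the LEDs.
void sendFrame(const std::vector<uint8_t>& bytes) {
    setPin(SS_PIN, 0);                       // select: shift, don't latch
    for (uint8_t b : bytes)
        for (int bit = 7; bit >= 0; --bit) {
            setPin(MOSI_PIN, (b >> bit) & 1);
            setPin(SCLK_PIN, 1);             // clock pulse, high then low
            setPin(SCLK_PIN, 0);
        }
    setPin(SS_PIN, 1);                       // deselect: latch to the LEDs
}

// For checking: the whole frame should toggle /SS exactly twice,
// however many bytes are sent.
int ssTransitions() {
    int n = 0;
    for (auto& ev : events) if (ev.first == SS_PIN) ++n;
    return n;
}
```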

Each LED uses two bits to specify its settings, given in the table below.

LED control bit format

Bit 0    Bit 1    Definition
0        0        LED off
0        1        LED on, no fade
1        0        LED off, fade up on fade clock
1        1        LED on, fade down on fade clock

The first two combinations are simple: the LEDs are either fully on or fully off. Although the HL1606 can fade between colours (yes, I'll get to that in a bit), it can't display a full RGB gamut because the fade clock is shared between the RGB LEDs and affects them all equally. That gives a total of eight possible colours (if you count 'off' as a colour). The available colour combinations are as follows.

Available colours

Red    Green    Blue    Visible colour
off    off      off     Off
on     off      off     Red
off    on       off     Green
off    off      on      Blue
on     on       off     Yellow
on     off      on      Magenta
off    on       on      Cyan
on     on       on      White

Fading is possible in both the 'up' and 'down' directions by specifying one of the fade bit combinations above. Fading between any of the eight possible colours can be achieved by the appropriate combination of individual colour fades; for example, to fade from red to cyan you'd send a control byte of (red fade down, green fade up, blue fade up) - see the tables above. The speed of the fade is controlled by a clock pulse supplied on the fade clock pin - the faster the clock, the faster the fade. As I've said, the fade clock pin is shared by all the RGB segments of all the LEDs in the strip. Two fade rates are possible, 127 steps or 63 steps, selected by the fade rate bit in each LED's control byte. Note that the datasheet says there are 128/64 steps, but that's incorrect. Note also that LED segments set to fade up don't start out off; they start at the first rung of the brightness ladder and fade up from there. Finally, the perceived brightness of the fade ramp is not linear, so although it can be used to control overall brightness, it's only at a fairly coarse level.
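The red-to-cyan example above can be written out as a single control byte. The field values below follow my reading of the tables earlier in the post (blue in bits 0-1, red in 2-3, green in 4-5; fade up = bit 0 set, fade down = both bits set) - an assumption to check against hardware:

```cpp
#include <cstdint>

const uint8_t FADE_UP   = 0b01;  // bit0=1, bit1=0
const uint8_t FADE_DOWN = 0b11;  // bit0=1, bit1=1
const uint8_t LATCH     = 0x80;  // buffer latch bit, normally set
const uint8_t FAST_FADE = 0x40;  // 63 steps instead of 127

// Red fades down while green and blue fade up: red -> cyan.
uint8_t redToCyan(bool fast) {
    uint8_t blue  = FADE_UP;         // bits 0-1
    uint8_t red   = FADE_DOWN << 2;  // bits 2-3
    uint8_t green = FADE_UP << 4;    // bits 4-5
    return (uint8_t)(blue | red | green | LATCH | (fast ? FAST_FADE : 0));
}

// With the fade clock shared across the strip, the fade duration is
// simply the step count divided by the fade clock frequency.
double fadeSeconds(bool fast, double fadeClockHz) {
    return (fast ? 63 : 127) / fadeClockHz;
}
```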

The last important thing to note is the purpose of the buffer latch bit in each LED control byte. The Google Code library refers to it as the 'whitespace' bit, which isn't particularly accurate. For normal use the bit should always be set; otherwise, when starting with a fully 'off' strip, the data shifted down the chain will have no effect and nothing will be visible. When the bit is unset, the rest of the bits in the control byte are ignored and the effect is to shift everything along the chain by one LED, with the first LED retaining its current setting. For example, if the first three LEDs of the chain are currently displaying (R,G,B), sending two bytes with the latch bit unset followed by a blue byte with it set would result in (B,R,R,R,G,B) being displayed.
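The latch bit's effect on what the strip displays can be modelled in a few lines (again a simulation of the visible behaviour, not the wire protocol):

```cpp
#include <string>
#include <vector>

// A byte with the latch bit set pushes a new value onto the front of the
// strip; with it unset, everything still shifts along by one LED but the
// first LED keeps its current value. The last value falls off the end.
std::vector<std::string> pushByte(std::vector<std::string> strip,
                                  const std::string& colour, bool latch) {
    for (size_t i = strip.size() - 1; i > 0; --i) strip[i] = strip[i - 1];
    if (latch) strip[0] = colour;  // otherwise LED 0 retains its setting
    return strip;
}
```

Running the example from the text - a strip showing (R,G,B), two latch-unset bytes, then a latched blue byte - reproduces the (B,R,R,R,G,B) result.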

There are more details that I haven't covered in this post such as how to drive the strip with the AVR's hardware support, how to do patterns that go in both directions along the strip, how fast you can reliably drive the strip (not the same as the datasheet), how to drive multiple strips simultaneously whilst sharing pins between them, how the HL1606 actually isn't a SPI device at all and how to make it behave as one - but I'll cover those in later posts so check back for more. In the meantime, here are two videos of the strips in action, as a taster :-)


Categories : AVR, Tech

Re: How the HL1606 works

Sorry, reposting this from hackaday so feel free to delete this comment if you can e-mail me! Trying to make a glow staff out of pixel RGB strips so would appreciate the following information!

I'm wondering if someone has tried bypassing the custom chip commands and outputting directly to the pwm. I would be really grateful for information about what exactly is required from say the Arduino end to drive your own colour signal. Also the information about the latency involved in passing such a signal through the strip would be greatly appreciated!

Re: How the HL1606 works

You can't 'bypass' the chips; they decode the signal that's shifted down the chain to drive the LEDs, and there's no way of getting at the internal PWM functionality that the chips use to provide fading. You could try implementing your own PWM by driving the LEDs either fully on or off and doing that quickly enough to avoid flicker. I'm clocking a short 20-LED length at 2MHz, which is faster than the spec; I suspect you wouldn't get away with that for a full 200-LED strip. The spec says the fastest you can set all 200 LEDs is about 1msec. Assuming you want to keep above 30Hz to avoid flicker, that would only give you about 5 brightness levels per LED. If you used shorter lengths and overclocked as I have, you could obviously do better than that, but it's never going to compete with a strip that uses ICs designed to provide full control of the brightness of each LED.

Re: How the HL1606 works

Hi Alan

Would you mind sharing your Arduino code for doing patterns that go in both directions along the strip?



Re: How the HL1606 works

Hi PJ, I don't have simple code that shows how it's done, but the principle itself is really simple - see this post, where I explain how it is done.

Re: How the HL1606 works

I think you have the control bits reversed up above... bit 0 on bit 1 off is LED on, bit 1 on bit 0 off is fade in, on my strip. This seems confirmed by the datasheet up at adafruit.

Re: How the HL1606 works

No, the table above is correct, note the bit ordering in the table is (0,1). The datasheet orders bits from LSBit to MSBit in some places and from MSBit to LSBit in others. The datasheet is, at best, confusing :-)