Driving the HL1606 using the Arduino's hardware support

In my last Arduino post I explained the basics of how the HL1606 works; if you haven't already read that post, I suggest you do so first.

The Google Code library that drives the HL1606 does so by bit-banging the control lines. That's both slower and more CPU-intensive than doing it in hardware, and in my application both speed and CPU usage are an issue. The Atmel AVR CPUs used on the Arduino boards have hardware support for SPI, and as I said in my last post, the HL1606 datasheet says it is SPI-compatible. However, the Arduino libraries don't provide access to the CPU's SPI features, so I had to program it directly. To follow the discussion below you'll need a copy of the datasheet for your AVR CPU; links to the various CPU datasheets are available from the Arduino website. In addition, as speed is of the essence, I'm using direct pin I/O rather than the Arduino digitalWrite() function, which is an order of magnitude slower. I'm also not going to delve too deeply into the intricacies of AVR SPI as there are other good tutorials that cover it, such as this one.

I need this code to work on both a Duemilanove and a Mega, so the first step is to define some macros to access the SPI pins. The SPI pin assignments are fixed by the hardware, so we get them from the datasheet. The comments show the corresponding Arduino pin numbers.

// Duemilanove.
#if defined(__AVR_ATmega328P__)
#define SPI_DDR  DDRB
#define SPI_PORT PORTB
#define SPI_MOSI 3       // Arduino pin 11.
#define SPI_MISO 4       // Arduino pin 12.
#define SPI_SCK  5       // Arduino pin 13.
#define SPI_SSN  2       // Arduino pin 10.

// Mega.
#elif defined(__AVR_ATmega1280__)
#define SPI_DDR  DDRB
#define SPI_PORT PORTB
#define SPI_MOSI 2       // Arduino pin 51.
#define SPI_MISO 3       // Arduino pin 50.
#define SPI_SCK  1       // Arduino pin 52.
#define SPI_SSN  0       // Arduino pin 53.
#endif

The next step is to initialise all the pins and put them into a known state. Note that in master mode the SPI hardware forces MISO to be an input regardless of its direction bit, and the HL1606 strips don't actually output any data anyway, so I set it as an input for clarity.

    // Initialise the SPI pins.
    BIT_HI(SPI_DDR, SPI_MOSI);  // Output.
    BIT_LO(SPI_DDR, SPI_MISO);  // Input.
    BIT_HI(SPI_DDR, SPI_SCK);   // Output.
    BIT_HI(SPI_DDR, SPI_SSN);   // Output.

    BIT_LO(SPI_PORT, SPI_SCK);  // Low.
    BIT_HI(SPI_PORT, SPI_SSN);  // High.

The configuration of the hardware SPI is controlled by the SPCR and SPSR registers. These are defined for you by the avr-gcc environment, along with the appropriate bit values, so we can just access them directly.

    // Initialise SPI.
    SPCR = _BV(SPE) | _BV(MSTR) | _BV(SPR0);
    SPSR = _BV(SPI2X);

The _BV macro maps a bit number (0..7) to the corresponding bitmask. SPE is the SPI enable bit; MSTR selects SPI master mode, as the Arduino will be the bus master; and SPR0 in conjunction with SPI2X sets the hardware SPI to 1/8th of the 16MHz system clock, i.e. a SPI clock frequency of 2MHz. The HL1606 datasheet gives a minimum SPI clock period of 600ns, which equates to a maximum clock frequency of 1.66MHz, so I'm actually clocking the SPI interface faster than the stated maximum. However, this appears to work fine on the short 20-LED segments I'm using; I suspect that for longer strips, clock skew between the HL1606s will probably make such high speeds unreliable.

The last step is to actually write some data to the SPI interface. To do this, we first pull the /SS pin low to select the strip, then write the data, then pull /SS high again.

#define BIT_HI(R, P) (R) |= _BV(P)
#define BIT_LO(R, P) (R) &= ~_BV(P)

void output(uint8_t *data, uint8_t len) {
    BIT_LO(SPI_PORT, SPI_SSN);          // Select the strip.
    data += len - 1;
    for (; len > 0; len--) {
        SPDR = *data--;                 // Clock the byte out.
        while (! (SPSR & _BV(SPIF))) {
            // Busy loop until the transfer completes.
        }
    }
    BIT_HI(SPI_PORT, SPI_SSN);          // Deselect the strip.
}
SPDR is the SPI data register: writing to it clocks the data out on the SPI bus, and reading from it returns any data that was put on the bus by the slave during the write operation - there isn't any in this case. The while loop polls the SPI status register, waiting for the write to complete, which at this clock rate takes 16 processor clock cycles per byte. Note also that we write the data in reverse order: the LED strip is in effect one big shift register, so the last byte has to be sent first.

That sorts out writing the LED control bytes to the strip, but we still need to provide the fade clock to do the LED fade transitions. The Google Code library does this by bit-banging the fade clock pin up and down which, as I explained earlier, isn't an option for my application. The AVR has a number of hardware timers, and we can use one of them to provide the fade clock. Timer0 is used by the Arduino run-time for its own timing needs, so the 16-bit Timer1 seems the best bet. The AVR timers are one of the more complex parts of the chip and have many different modes - see the datasheets for details. For this application, the mode we want is 'Fast PWM'. In this mode the timer counts up from zero to a specified limit, toggling an output pin when it reaches it; the timer then resets to zero and the cycle repeats. As before, we set up some macros for the relevant pins and initialise the timer.

// Duemilanove.
#if defined(__AVR_ATmega328P__)
#define FAD_DDR  DDRB
#define FAD_PORT PORTB
#define FAD_CLK  1      // Arduino pin 9 (OC1A).

// Mega.
#elif defined(__AVR_ATmega1280__)
#define FAD_DDR  DDRB
#define FAD_PORT PORTB
#define FAD_CLK  5      // Arduino pin 11 (OC1A).
#endif

    // Initialise the LED clock pin.
    BIT_HI(FAD_DDR, FAD_CLK);   // Output.
    BIT_LO(FAD_PORT, FAD_CLK);  // Low.

    // Initialise timer 1 - fast PWM, use OCR1A, toggle OC1A, no interrupts.
    TCCR1A = _BV(COM1A0) | _BV(WGM11) | _BV(WGM10);
    TCCR1B = _BV(WGM13) | _BV(WGM12);
    TIMSK1 = 0x00;

To start the clock running we turn on the appropriate bits in the TCCR1B register; to stop it we clear them. That starts and stops the clock square wave on the corresponding output pin.

#define PRESCALE (_BV(CS11) | _BV(CS10))  // Prescale by 64

    TCCR1B &= ~PRESCALE;    // Clock off.
    OCR1A = ticks;          // Number of ticks between each output pin toggle.
    TCCR1B |= PRESCALE;     // Clock on.

The last thing to mention is the selection of the prescaler value, and how to calculate the value of ticks. The HL1606 datasheet says that the maximum fade clock frequency is 200Hz. A little experimentation shows that we can overclock that as well, at least on short LED strips. The maximum rate is around 1KHz - beyond that you start to get glitches, dependent on the pattern being displayed - usually all the LEDs on the strip start flashing blue or white. We therefore need to come up with timer settings that allow us to generate a 1KHz or slower clock.

The timer is driven by the CPU clock, which runs at 16MHz, or 62.5nsec per cycle. The fade clock needs both a low-to-high and a high-to-low transition per cycle, so that's two compare matches (toggles) per output clock cycle. The required OCR1A value is therefore CPU clock rate / prescaler / (2 * fade clock frequency), where we get a choice of prescaler value from 1, 8, 64, 256 or 1024. The best choice is a prescaler of 64, because that gives a nice whole number of timer ticks for a 1KHz fade clock whilst still giving us access to frequencies in the KHz range. 1KHz (1msec per fade clock cycle) requires an OCR1A value of 16MHz / 64 / (2 * 1000) = 125, and the maximum OCR1A value (65535) gives approximately 2Hz (524msec). The HL1606 can fade between colours over either 63 or 127 fade clock cycles, which gives us a fastest fade speed of 1msec * 63 = 63msec and a slowest fade speed of 524msec * 127 = ~67 seconds, which will be fine.

This isn't quite the end of the story. I added a second LED strip, using a /SS pin per strip to select the strip that I wanted to drive. That didn't work, with the second strip behaving in a most puzzling way. The next post in this series will describe how I diagnosed what was happening, and how I worked around the problem. Stay tuned :-)

Categories : Tech, AVR

Re: Driving the HL1606 using the Arduino's hardware support

Thank-you! That is exactly what I needed to know! I didn't understand the discussions people were having about the 'hackability' of this chip. I want to produce a sort of POV effect so I'll look for a LPD-6803 or D705 based solution for my glow staff. Nonetheless, the SPI information in your write-up is extremely useful and I will follow your discussions eagerly. =)

Re: Driving the HL1606 using the Arduino's hardware support



I'm new to Arduino and the HL1606 and am about to experiment with it. I've compiled some code using the LEDStrip google project so I expect to get that working. However, I would rather use some of your code since it seems more specialized for the HL1606 and because you say it performs quicker.

I'm not looking for demo code, but something to address the HL1606 with, are you willing to share?


Re: Driving the HL1606 using the Arduino's hardware support

I've just released a library for high performance spi for driving leds on the arduino - check out FastSPI_LED.  It supports multiple led chipsets (hl1606, *595 shift registers, lpd2803, and a few more coming down the line), allows you to tune how much cpu usage it uses (and soon, will allow you to tweak prioritizing refresh rate or color levels).

I'm using it to do software based PWM of 40 hl1606 controlled leds, getting 32-48 levels of color with a ~60-65Hz refresh rate while only using ~65% of the arduino's cpu time - you can check out a video of it in action.

The hl1606 is really the bottom of the barrel as far as led controllers go, IMHO, though.  With straight up shift registers, I can push close to 1.2 million individual led on/off operations per second vs. the 375,000 ish that i seem to be capped at with the hl1606s.  Also - there are new controller chipsets coming down the line that are even better.

There's the lpd6803, which is used in these led pixels, where you can push down what color level (32 distinct levels) you want each of r,g, and b to be at and the chip does its own pwm based on you cycling the clock line.

Even better chipsets are available now than that.  There's the WSC2801 - which does 256 levels of color for each of r,g, and b and has its own onboard clock for doing pwm, so you don't even have to manage the clock line.  Then there's the CYT3005 which does 512 levels of color for each of r,g, and b.  Check out Ali Express to find leds using those chipsets.


Re: Driving the HL1606 using the Arduino's hardware support

I've written a pattern generation library that allows you to animate a given set of colours across the strips, does cross-fades between patterns and will drive multiple strips simultaneously.  Unlike all the other code I've seen it doesn't rely on delay() calls to do the timing or fading; it uses a task scheduler for the pattern generation and a hardware counter to do the fading, so you can generate multiple patterns simultaneously.  The patterns don't require you to write code, they are all done as data definitions.  Haven't decided if I'm going to release it yet :-)

Re: Driving the HL1606 using the Arduino's hardware support


Would be really great if you could release your patterns!

pleeeeeeeeeeeease :-)

Keep on your great work


Re: Driving the HL1606 using the Arduino's hardware support

I'm intending to write a post on how I did the pattern generation as it's not immediately obvious why I implemented it the way I did.  Check back in a week or so :-)

Re: Driving the HL1606 using the Arduino's hardware support

 I should do a writeup of the pattern code i've got as well.  I've become very fond of generative patterns - ~20 lines of code giving me tens of thousands of patterns that have a very nice appearance to them - http://www.youtube.com/watch?v=dZ6_4VV3VGI - also, I saw a group recently that used png files to 'encode' patterns, allowing people to draw what they want the patterns to do then upload the "images".  

Playing around with the various stripes of leds (raw shift registers, hl1606, lpd6803, ws2801, and soon, tm1803s) has been a lot of fun.  I'm hoping to get some more cleanup on the led library and maybe some more sample pattern code out for more people to play with.

Re: Driving the HL1606 using the Arduino's hardware support

 Alan, are you familiar with TM1812 LED strips? I recently bought such one from http://www.bestlightingbuy.com/digital-tm1812-rgb-led-light-strip-160-leds-dc-12-volt-waterproof.html , can I write pattern code like others, such as the hl1606, or the lpd6803?