Jurek and the Amazing Techno, Colored DreamWall

The title is mostly a placeholder, as I haven't really figured out a name for it yet. This project is a wall hanging made up of semi-large triangular pixels using discrete RGB LEDs, with PWM controlling the intensity of each LED, resulting in a 4096-color display.

Tuesday, October 10, 2006

Wow, I've had a productive (and depressing and exhausting) last three days. This post is long, so please bear with me. It is entirely about coding, so I don't blame you if your eyes glaze over and you stop reading out of boredom.

I set out on Sunday to try to get the ADC and distance sensor working correctly. The distance sensor was fairly easy to hook up and seemed to work fine: 0.4v to 2.1v, depending on how close something was (higher voltage = closer). I wasn't really worried about that, however.
The ADC was of prime concern, and it seems that my concern wasn't without merit. I was eventually able to get the code to write to the ADC, as well as read at least one byte from it (all my TWI programming thus far has been purely writing). I think I'm only able to read a single byte, as I'll explain in a bit.
Try as I might, I wasn't able to get anything usable from the ADC off the TWI bus beyond the first 4 bits. It's a 12-bit ADC, so it requires two read cycles on the TWI to get all 12 bits: the first read returns the 4 most significant bits (generally the most important), and the second read returns the remaining 8 least significant bits (still important). The only thing I was able to get from it was all 1's (not promising; it means the ADC is flatlined somehow).
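Assuming the reads eventually work, stitching the two TWI bytes back into one 12-bit sample is straightforward. Here's a minimal sketch (my own helper for illustration, assuming the low nibble of the first byte carries bits 11-8 and the second byte carries bits 7-0):

```c
#include <stdint.h>

/* Combine the two TWI read cycles into one 12-bit ADC sample.
   first_byte:  low nibble holds ADC bits 11..8 (upper nibble masked off)
   second_byte: ADC bits 7..0 */
uint16_t adc_combine(uint8_t first_byte, uint8_t second_byte)
{
    return (uint16_t)(((first_byte & 0x0F) << 8) | second_byte);
}
```

A flatlined bus reading back all 1's would show up here as 0x0FFF, the maximum 12-bit value.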

After a couple hours of poring over datasheets and fiddling with code and a multimeter, I eventually found some bad news. The power being supplied to the ADCs is about 4.1v. The nominal supply range listed in the datasheet is 2.7v-3.6v (I had assumed my voltage regulator would correctly drop the 5v coming into the board down to 3.3v). Worse, I failed to read the part that says that if the supply voltage is above 2.7v (note: "above 2.7v" doesn't mean "above the nominal voltage range"), the REF pin needs to be tied to VDD instead of to ground as shown in the handy little wiring diagram (which is the way I have it wired up on the board).
So it looks like the ADCs on the board are bunk. I think the chips themselves are functioning fine, but they don't have the correct voltage reference to properly do the conversion.
It looks like I'll have to wait for version 1.1 of the microcontroller board to do any sensors. :( Boo-hoo! (unless I can rig up some other board with different parts).

My next major hurdle was something that has been lingering with the code since day one. I have been using Keil's uVision development suite for the 8051 microcontroller. The starter kits came with an evaluation version of this software, which is fully functional except that you can't compile code larger than 2k bytes. I had been hovering at just under 2000 bytes until I started testing the ADC, but then noticed that I had to start removing unused variables because I was hitting the limit. I looked into pricing for the software and was immediately floored by the $1700 price tag. I found a newer version of the software (also an evaluation version) and inquired about whether they had a "light" or "hobbyist" edition. Of course, no go there. $1700. Take it or leave it. Well, I certainly chose the latter option.

After some quick searching, I found many different options for 8051 compilers: freeware, cheap, and full-priced (with evaluation versions). I tried them all, some several times. Some had really horrible C compiler support ("bit" wasn't a recognized type, incomplete SFR (Special Function Register) definitions, etc.). Some I just couldn't get to work (horrible UI, a DOS-based IDE???, goofy compile steps).
My most promising candidate was SDCC, an open-source retargetable compiler for various microprocessors. It supports the Intel 8051, the Zilog Z80 (the TI-85, I think, and the Game Boy), the Motorola 68HC08 (successor to the 68HC05 line), and Microchip's PIC16 and PIC18 lines. It was able to compile the uVision code basically out of the box (I had to change a few include files, which is understandable).
Unfortunately, the first time I tried programming the hex file onto the uC, it failed. After a brief visit to the documentation, I found that the default output format, .ihx, isn't compatible with most programmers and must be converted to .hex with an included converter (packihx). I tried that and, again, it failed! Back and forth I went, comparing uVision's .hex to SDCC's .hex and trying various compiler options, to no avail.
I did some searching on their message boards this morning and eventually found a post referring someone to search on 8052.com for help. Then I found this post:
Atmel has been shipping bogus header files for several "non-Keil" C compilers for some years now. You're lucky if IAR throws out warnings/errors.
(The header files Atmel ships for SDCC are bogus too, but the SFR definitions look like variable definitions to SDCC, so you get no warning.)
So I downloaded the header file they provided (thank the gods!) and, after some quick modifications to the SFR header, I was able to get the code to compile and run in both uVision and SDCC!
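For the curious, here's a minimal sketch of the kind of difference involved. This is my own illustration using the standard 8051 P0 port register at address 0x80, not Atmel's actual header contents:

```c
/* Keil C51 syntax, as used by the uVision headers: */
sfr  P0   = 0x80;      /* special function register at address 0x80 */
sbit P0_0 = P0 ^ 0;    /* bit 0 of P0 */

/* SDCC syntax, which the corrected header uses instead: */
__sfr  __at (0x80) P0;
__sbit __at (0x80) P0_0;
```

To Keil's compiler the SDCC-style lines are meaningless, and to SDCC the Keil-style `sfr` line can look like an ordinary variable definition, which is why it compiled silently but ran wrong.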
Yay, the shackles of a 2k codespace have been broken!
There is one minor caveat, however: the SDCC-compiled code runs at about half the speed of the uVision code. Not a big deal currently, and I don't think it will impact me too much. I have yet to dig into compiler optimizations, so there may be some speedup yet to be had.

And that is how I spent my summer vacation...


