Next: Conclusions Up: Embedding Linux to Track Previous: Serial Port


DSP development

Low-cost single-board and single-chip computers offer fast hardware I/O with chaining DMA and signal-processing features such as multiply-accumulate instructions. Often, these units can be purchased with Linux pre-installed.

At the time that the electronics were designed, several years ago, these options were not available at reasonable cost. Therefore, a dedicated small signal processing chip was designed into the electronics. The Analog Devices 21xx series processor has two synchronous serial interfaces, one of which communicates with the sensor electronics and the other with the host processor.

A team of several programmers developed the code in parallel, segregating the project along functional lines. The Windows-based development tools would become confused when importing changes made by another programmer, such as when a shared header file needed to be modified. Because some code had been lost this way during a previous project, this project used the vendor's DOS-based tool chain instead. That tool chain made no assumptions about the state of the underlying source files and relied only on file timestamps to control the build process, which made it more reliable.

Each developer had an account on a Linux computer running Samba, which exposed their directory as a share to any desktop computer. The tool chain executed in the directory of the individual's SMB share, resulting in a self-contained development environment. Each user would also log into the Linux machine to run cvs, or to take advantage of the more flexible search and indexing tools available at the Unix command line.
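A per-developer share of this kind can be provided by Samba's built-in [homes] section, which maps each Unix account's home directory to a share of the same name. The fragment below is an illustrative sketch, not taken from the paper; option values would be tuned for the site:

```
; Hypothetical smb.conf excerpt: expose each Unix home directory as a
; writable share, so the Windows-hosted tool chain edits the same files
; that cvs and the Unix command-line tools see.
[homes]
   comment = Developer build directories
   browseable = no
   writable = yes
   create mask = 0644
```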

This allowed rapid parallel development with no code loss. The Network Time Protocol (NTP) was used to synchronize the clocks of the various computers, but this was unreliable on Windows 98 machines, which tended to believe whatever any network server had most recently told them. Occasionally this produced build errors, when cvs timestamps were misaligned by just enough to mislead make without triggering a warning.
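The failure mode is easy to reproduce: if a client's clock runs fast, a freshly committed source file can carry a timestamp older than the object file already on disk, and make silently decides there is nothing to do. The sketch below demonstrates this with hypothetical file names (main.dsp, main.obj); touch -d stands in for the skewed clock:

```shell
# Reproduce a clock-skew stale build: backdate the source file so make
# believes the existing object file is newer and skips the rebuild.
rm -rf /tmp/skewdemo && mkdir /tmp/skewdemo && cd /tmp/skewdemo
printf 'main.obj: main.dsp\n\tcp main.dsp main.obj\n' > Makefile

echo "old code" > main.dsp
make main.obj                      # first build: copies old code into main.obj

echo "new code" > main.dsp
touch -d '2 minutes ago' main.dsp  # simulate a commit stamped by a fast clock
make main.obj                      # "up to date" -- the new code never builds

cat main.obj                       # prints "old code"
```

Because make compares only timestamps, no warning is produced; the stale object simply ships.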

The download and diagnostic software was implemented on Linux machines, using serial and parallel port interfaces. The DOS tools were installed in a virtual hard disk image under dosemu and wrapped in a Linux makefile, so that all developers could use a single, identical execution environment for the tool chain from any computer. In addition to allowing the developers to telecommute, this let them make code changes from whichever computer happened to be closest to the hardware.
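A wrapper rule of roughly this shape lets make treat the DOS assembler like any native compiler. The tool name, file suffixes, and dosemu options below are illustrative assumptions, not details from the paper; the exact invocation syntax varies between dosemu versions:

```
# Hypothetical makefile fragment: run the vendor's DOS assembler inside
# dosemu, using the shared virtual disk image so every developer invokes
# the identical tool chain regardless of which desktop ran make.
DOSEMU = dosemu -dumb

%.obj: %.dsp
	$(DOSEMU) "asm21 $<"
```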

Thus, typing make test can rebuild the DSP code, download and verify it, reboot the DSP, check that it wakes up, and then run low-level regression tests for obvious errors. It then checks the source tree for the protocol drivers on the host, reinstalling them if necessary, before running a high-level functional test that verifies reliable acquisition. This kind of cross-tool automation in a `make' environment is trivial under Unix. Because Windows tool chains are designed by their vendors to be completely self-contained, this kind of easy project integration is generally impossible there.
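The chain above might be structured as a series of dependent targets, so that make test runs the whole sequence and stops at the first failure. Every script, device, and target name in this sketch is hypothetical; the paper does not list them:

```
# Hypothetical sketch of the make test automation chain.
test: functional

dsp.bin: $(DSP_SOURCES)
	$(DOSEMU) "asm21 main"           # rebuild the DSP code under dosemu

download: dsp.bin
	./dsp_load /dev/ttyS0 dsp.bin    # download, verify, and reboot the DSP
	./dsp_regress /dev/ttyS0         # low-level regression tests

drivers:
	$(MAKE) -C host-drivers install  # reinstall protocol drivers if stale

functional: download drivers
	./acquire_check                  # high-level acquisition test
```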

A single Linux computer exported the directories to the network, ran the tools under the DOS emulator, and streamed data to and from local hardware test stations. This Pentium 75 system supported three simultaneous users with 16 MB of RAM and 3 GB of disk space. When two additional users were added, one of whom was using octave and gnuplot to analyze test results, the memory was doubled to 32 MB.


alex.perry@ieee.org