Let's start with a rather simple thing: the tool used to interface with the chip's one-wire DBG pin. The pin is nothing more than an autobauding, half-duplex asynchronous port at CMOS voltage levels. Such simplicity was reflected in Zilog's first debug "tool", sold with the early Z8F6401 development kits. It was a small board (approx. 1 sq. in.) with a TTL-to-RS232 converter chip (a MAX232, IIRC), a diode standing in for an open-drain driver, the DBG line pull-up, some decoupling and charge-pump capacitors, and two connectors -- one for the 6-pin target header, another for the serial cable.
Soon thereafter it became obvious that RS232 serial ports were being phased out of newly made PCs. In keeping with the times, Zilog switched to a USB-based "smart cable". This came in a plastic enclosure and had a Zilog MCU, a USB interface chip, and assorted other circuitry. Naturally, it was undocumented, and to use it you had two choices: use the driver and DLLs that Zilog furnished, or reverse-engineer the protocol. The latter was not very attractive since the replacement was almost a no-brainer: a USB-to-serial converter plus the simple serial-to-DBG interface.
Some of you will now think: well, wait a moment, didn't the "smart" cable provide some extra functionality compared to the crude serial-port-based interface? Oh yes, it did: it could also drive the target's RESET# line. That's about it. Never mind that the DTR line could be used for the same purpose; of course, the original "dumb" cable didn't have the DTR-to-RESET# connection. I can almost visualize the development tools division manager in a meeting with the Encore! line strategist and upper management: our new "smart" cable will provide the RESET# signal to the newfangled 8-pin XP series MCUs, and besides, it will make things much faster.
In less than a decade, Zilog managed to put out at least four versions of the DBG-to-PC interface: the "dumb" one using a MAX232, two USB smart cables, and an Ethernet smart cable. Timing full-chip program loads using the DLLs and drivers provided with the most recent version of ZDS II, the "dumb" interface wins hands down, providing a 25-50% speedup on programming and snappier behavior during debugging sessions. So, a lot of engineering effort for naught. Note that the "dumb" solution can be trivially ported to USB, and even isolated, all at rather minimal cost, even if one wants the option of powering the target from USB.
In that same decade, Zilog managed to change the logo stamped on their Encore! chips no fewer than three times. When I get back from my travels, I'll have to snap some pictures, as it is somewhat entertaining. I have a cache of single pieces from various Encore! lots.
In the same decade, they managed to keep shipping a free, but admittedly rather botched, C compiler/IDE combo called Zilog Developer Studio II. During this period, the rather obscene code generation bugs were left alone, one quite useful feature was dropped, other useful features were added, and development generally progressed at a glacial pace compared with, say, gcc. The ZDS II woes almost warrant another post, but I'd much rather let sleeping dogs lie.
Zilog sure has its priorities straight, if you ask me.
I will finish with a rather technical look at the real (vs. Zilog-imagined) needs of a "smart" cable for Zilog's DBG pin protocol.
As it turns out, whatever "smarts" the smart cable had were wholly unnecessary. You see, Zilog's chip design folks have very thoughtfully made the half-duplex DBG pin protocol inherently streaming-friendly. Save for the oddball whole-ROM CRC calculation, every command is executed in real time and requires no pacing/waiting. The reply bytes closely follow the command bytes. By "closely" I mean a delay of a couple of system clock cycles. As expected, a full erase-upload-verify cycle on a Z8F4821, done via the FT232R USB-to-serial interface, takes a whopping couple of seconds, when done at ~150kbaud (that's 15k bytes per second using 8-N-1 format).
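The arithmetic behind those numbers is worth making explicit. The sketch below is a back-of-the-envelope estimate under my own assumptions (48 KB of flash on the Z8F4821, 8-N-1 framing, and roughly one pass over the flash each for upload and verify); it is not a measurement.

```python
# Rough throughput estimate for the DBG link at ~150 kbaud, 8-N-1.
# Assumptions (mine): 48 KB flash on the Z8F4821, and that upload and
# verify each cost about one pass over the flash contents.

BAUD = 150_000
BITS_PER_BYTE = 10              # 1 start + 8 data + 1 stop bit
FLASH_BYTES = 48 * 1024

byte_rate = BAUD / BITS_PER_BYTE        # effective bytes per second
upload_s = FLASH_BYTES / byte_rate      # one pass over the flash
total_s = 2 * upload_s                  # upload + read-back verify

print(f"effective byte rate: {byte_rate:.0f} B/s")
print(f"upload ~{upload_s:.1f} s, upload+verify ~{total_s:.1f} s")
```

That lands in the "couple of seconds per pass" ballpark quoted above, with protocol overhead and erase time adding a bit on top.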
One naturally wonders: is it possible to speed it up any further, and would placing a CPU between the USB-to-serial chip and the DBG line really help? Yes, somewhat. Let's assume we're using FTDI's interface chips, like the FT232R. Those have a small, 384-byte buffer. Since the OS can sometimes starve USB devices of read transfers, it could help to have an extra layer of buffering between the stream reflected from the DBG pin and the FT232R, obviously with RTS/CTS handshaking enabled.
The USB transfers are paced with a 1ms USB frame period. This means that a turnaround from the PC to the target and back is no shorter than ~1ms; in practice it is 3ms since the FTDI chip, faced with no further activity on the input, will purge the receive buffer after an extra delay of 2ms.
First the PC sends a read command to the target, the half-duplex interface reflects the command back, and the target appends results of the read. Before subsequent commands can be sent, the PC must receive the results.
This can be worked around by providing a method of pacing the transmission so that the reply has a place to "fit in". At "slow" baud rates, such pacing should require no extra effort: we send 0xFF (all data bits set), and the target pulls some of the bits low with its open-drain driver. I haven't checked whether the target's contention-detection circuit gets tripped by that, though. At "fast" baud rates, where we could reasonably expect the skew in the target's reply bitstream to be significant enough to corrupt it, this of course becomes problematic.
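The 0xFF trick works because the line is open-drain: each bit reads low if either side drives it low, i.e., the observed value is the bitwise AND of what both sides transmit. A toy model of that wired-AND behavior:

```python
# Toy model of the open-drain, wired-AND DBG line: a bit is low if
# either side drives it low. Transmitting 0xFF means the host releases
# every data bit, so the target's reply passes through unmodified.

def wired_and(host_byte: int, target_byte: int) -> int:
    """Byte observed on the shared line during one bit-aligned frame."""
    return host_byte & target_byte

# Host sends 0xFF as a pacing slot; the target's reply (0x5A here)
# comes through intact.
assert wired_and(0xFF, 0x5A) == 0x5A
# If the host drives any bit low during the reply slot, bits are lost:
assert wired_and(0xFE, 0x5A) == 0x5A & 0xFE
print("pacing slot carries:", hex(wired_and(0xFF, 0x5A)))
```

At slow baud rates the target's reply bits land squarely inside the host's 0xFF bit cells, so this bit-aligned model holds; at fast rates the skew breaks the alignment.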
This can be "fixed" by inserting an MCU between the USB-to-serial interface and the target. The MCU would perform only two functions:
- buffering the data when the host keeps RTS# deasserted, to prevent data loss due to overruns,
- inserting a "sense" delay between host bytes equal to 0xFF and the bytes that follow them, so the target's reply slot can close before the next byte goes out.
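The second duty can be modeled as a pure scheduling function from the host's byte stream to (pre-delay, byte) pairs. This is only a sketch of the idea; SENSE_DELAY_S is a made-up placeholder that real firmware would derive from the negotiated baud rate, and the RTS#-driven buffering is reduced to a comment.

```python
# Sketch of the MCU's pacing duty: turn the host byte stream into
# (pre-delay, byte) pairs to clock out toward the target. Buffering
# under deasserted RTS# would wrap this in a FIFO; omitted here.

SENSE_DELAY_S = 0.001   # hypothetical gap; derive from baud rate in reality

def schedule(host_bytes: bytes) -> list[tuple[float, int]]:
    """Return (pre-delay, byte) pairs for transmission to the target."""
    out = []
    prev_was_ff = False
    for b in host_bytes:
        # A delay goes between a 0xFF pacing byte and whatever follows,
        # giving the target's open-drain reply time to complete.
        delay = SENSE_DELAY_S if prev_was_ff else 0.0
        out.append((delay, b))
        prev_was_ff = (b == 0xFF)
    return out

plan = schedule(bytes([0x08, 0xFF, 0x12]))
print(plan)     # only the byte after the 0xFF slot gets a pre-delay
```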
Moreover, any host software can easily accommodate this functionality being absent, either by enforcing a round-trip latency -- following each FT_Write() with an FT_Read() -- or by selecting a lower baud rate (I'll have to check that!).
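The first of those workarounds, strict lockstep, can be sketched as follows. The FakeLink class below is a stand-in for the FTDI handle (FT_Write/FT_Read in D2XX terms) that mimics the half-duplex echo plus a canned one-byte target reply; the command and reply values are invented for illustration.

```python
# Lockstep host-side workaround: write one command, drain its echo and
# reply before sending the next. FakeLink imitates the half-duplex bus
# (every written byte is echoed, then a fake 0x42 reply is appended).

class FakeLink:
    def __init__(self):
        self._rx = bytearray()
    def write(self, data: bytes) -> None:
        self._rx += data + b"\x42"      # echo + canned target reply
    def read(self, n: int) -> bytes:
        out, self._rx = bytes(self._rx[:n]), self._rx[n:]
        return out

def lockstep_command(link, cmd: bytes, reply_len: int) -> bytes:
    link.write(cmd)
    echoed = link.read(len(cmd))        # discard our own echoed bytes
    assert echoed == cmd, "bus contention or framing error"
    return link.read(reply_len)

reply = lockstep_command(FakeLink(), b"\x08", 1)
print(reply)
```

Simple and robust, at the cost of one full USB turnaround per command -- exactly the ~3 ms penalty discussed earlier.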