Revisiting a Half-Duplex Tether


So today I was pondering a tether based upon Ethernet-over-Powerline (HomePlug), and went back into the forum to see what had been previously posted about it. Starting with this thread:

I got distracted by the sub-thread on half-duplex Ethernet. A couple of hours of research later, I decided that it might just work.

Half-duplex operation dates back to the earliest days of Ethernet, when it ran over coaxial cable with multiple stations talking on the same wire. Each station used a process called CSMA/CD (eh, you can Google it) to make sure the various stations did not stomp on top of each other. The benefit was that a single coaxial cable could carry bidirectional data between multiple stations. The drawback was that it severely limited the practical data rate on the cable; typical figures seem to be 30-40% of the peak transmission rate.
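To get a feel for how collisions and backoff eat into the raw rate, here's a toy slotted-time model. The parameters, the saturated-traffic assumption, and the simplified backoff countdown are all mine for illustration; it won't reproduce the exact 30-40% figure, just the mechanism:

```python
import random

def simulate_csma_cd(stations=5, slots=100_000, frame_slots=10, max_exp=10):
    """Toy slotted model of CSMA/CD on a shared cable.

    Every station always has a frame queued (worst case). At each
    contention round, all stations whose backoff has expired transmit:
    exactly one sender holds the cable for `frame_slots` slots; two or
    more is a collision that wastes a slot, and each collider picks a
    new random backoff (binary exponential backoff).
    """
    backoff = [0] * stations    # slots each station still has to wait
    attempts = [0] * stations   # consecutive collisions per station
    useful = 0                  # slots spent carrying real data
    t = 0
    while t < slots:
        ready = [i for i in range(stations) if backoff[i] == 0]
        if len(ready) == 1:
            # Successful transmission: cable busy for the whole frame.
            useful += frame_slots
            t += frame_slots
            attempts[ready[0]] = 0
        elif len(ready) > 1:
            # Collision: the slot is wasted, colliders back off.
            for i in ready:
                attempts[i] = min(attempts[i] + 1, max_exp)
                backoff[i] = random.randint(1, 2 ** attempts[i])
            t += 1
        else:
            t += 1  # idle slot, nobody ready
        # Simplification: backoff counters tick once per round.
        for i in range(stations):
            if backoff[i] > 0:
                backoff[i] -= 1
    return useful / t

random.seed(42)
u = simulate_csma_cd()
print(f"utilization ~ {u:.2f}")
```

Real utilization depends heavily on frame sizes, station count, and cable propagation delay, which is where the 30-40% rule of thumb comes from.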

While 30% of a 10 Mbit/sec link would not be useful for OpenROV, if we could get 30% of a 100 Mbit/sec link, it would work fine for sending HD video. While half-duplex operation is normally associated with the 10 Mbit/sec Ethernet flavors (specifically 10Base-5 and 10Base-2), it turns out that the Ethernet spec, back in its dusty corners, still supports half-duplex operation at 100 Mbit/sec. In fact, if you look at the auto-negotiation process:

you can see that 100 Mbit/sec half-duplex operation is specifically tested for.
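The bandwidth arithmetic behind the claim above is simple enough to spell out (the ~8 Mbit/sec figure for a compressed HD video stream is my assumption, not something from this thread):

```python
# Usable bandwidth if half-duplex contention leaves ~30% of the raw rate.
EFFICIENCY = 0.30
HD_VIDEO_MBPS = 8  # assumed bitrate for a compressed 1080p stream

results = {}
for link_mbps in (10, 100):
    usable = link_mbps * EFFICIENCY
    results[link_mbps] = usable
    verdict = "enough" if usable >= HD_VIDEO_MBPS else "not enough"
    print(f"{link_mbps} Mbit/s link -> ~{usable:.0f} Mbit/s usable: {verdict}")
# A 10 Mbit/s link leaves ~3 Mbit/s (not enough); 100 Mbit/s leaves ~30 (enough).
```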

So in the original discussion on half-duplex mode linked above, Eric listed two issues with it:

1.) Receiver on node hears its own transmissions and gets confused

2.) Paralleling the transmit and receive channels messes up cable termination (impedance matching)

We'll start with #2, since it's a solvable problem.

Sure enough, when you parallel the transmit and receive ports of an Ethernet connection, the impedance looking into that connection is now 50 ohms rather than the desired 100 ohms, and you're going to have a lot of problems with ringing on the cable. The solution is to change the termination of each channel on the BeagleBone from 100 ohms to 200 ohms. This is done by changing 4 resistors (R129, R130, R133, R134) from 50 ohms to 100 ohms, which changes each port termination to 200 ohms.

A signal arriving at the BeagleBone then sees the two 200 ohm terminations in parallel, which looks like 100 ohms. The transmit side is a bit more complex. The transmit driver on an Ethernet PHY is a current (rather than voltage) driver that is designed to produce the correct voltage level when driving into 50 ohms. It normally sees the 100 ohm near-end termination in parallel with the 100 ohm cable, for 50 ohms. For our modified device, the transmitter sees its 200 ohm near-end termination, in parallel with the 100 ohm cable, in parallel again with the 200 ohm receiver termination. Mash all these together and it looks like 50 ohms, which is exactly what the transmitter wants to see. Here is what it looks like schematically:

So at the cost of changing out 4 resistors on the BeagleBone, we save the cost of the balun and avoid its 3+ dB power loss.
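The parallel-resistance bookkeeping above is easy to sanity-check numerically; this is just a trivial restatement of the math in the post:

```python
def parallel(*resistances):
    """Equivalent resistance of resistors connected in parallel."""
    return 1.0 / sum(1.0 / r for r in resistances)

# Receive direction: an arriving signal sees the two modified 200-ohm
# port terminations in parallel -- the 100 ohms the cable wants.
rx_seen = parallel(200, 200)

# Transmit direction: the PHY's current driver sees its own 200-ohm
# near-end termination, the 100-ohm cable, and the other port's 200-ohm
# termination -- the 50 ohms the driver is designed for.
tx_seen = parallel(200, 100, 200)

print(rx_seen, tx_seen)  # ~100.0 and ~50.0 ohms
```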

I'm assuming nobody's going to want to do this to their laptop, however.

So what about problem #1, that the receiver will get confused?

Well, this all depends upon how the receiver on the BeagleBone implements collision detection. Here's the datasheet for the Ethernet PHY on the BeagleBone:

Section 3.8.7 refers to collision detection, and another section talks about half-duplex operation in 10 Mbit/sec mode. Unfortunately there are no real details about how collision detection specifically works when the chip is in 100 Mbit/sec half-duplex mode.

There's also the wrinkle of an interesting comment on Sheet 8 of the Rev A6A BeagleBone schematic, which suggests that there might be something wrong with the pin connections between the processor and the PHY regarding collision detection (COL) and carrier sense (CRS).

Urg. I think what it's really going to take is talking to a good apps engineer at SMSC to see whether collision detection works properly at 100 Mbit/sec in this application. Alternatively, one could modify a BeagleBone and just give it a go. Since I've only got one BeagleBone right now, I'm not quite ready to make that leap.




Once again- AWESOME!

I'm still a bit foggy on the electrical engineering, but would it be possible to make a circuit that matches impedance on the other side (outside) of the isolation transformers on the BeagleBone? If that were possible, it could also be used computer-side and with other Ethernet blocks. Also, couldn't issue #1 (the port hearing itself) be fixed with the same electronics, by adding a circuit that subtracts its own signal from the line? Even though this would be half-duplex, such a circuit would just make the receiver hear nothing while a signal is going out from its own port.
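The "subtract your own signal" idea can be sketched numerically. This is an idealized toy model with a made-up coupling factor and perfect, distortion-free superposition on the line; a real circuit would have to estimate the coupling and deal with frequency-dependent echo:

```python
import random

random.seed(0)

# Random stand-ins for what each end transmits on the shared pair.
local_tx = [random.gauss(0, 1) for _ in range(1000)]
far_tx = [random.gauss(0, 1) for _ in range(1000)]

# Assumed fraction of our own transmission appearing at our receiver.
coupling = 0.5

# The line carries the superposition of both signals.
line = [coupling * tx + rx for tx, rx in zip(local_tx, far_tx)]

# Subtract the known local contribution to recover the far-end signal.
recovered = [s - coupling * tx for s, tx in zip(line, local_tx)]

error = max(abs(r - f) for r, f in zip(recovered, far_tx))
print(error < 1e-9)  # True: recovery is perfect in this ideal model
```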

It seems that the circuit we arrive at after considering all of this is once again a hybrid coil!

It still seems to me that either an amplified hybrid circuit or a chipset that uses a special protocol (e.g. IEEE 1901) would be the best way to go.

Man, what a challenge!



Yep, what you've just described is a hybrid. I've got some parts on order for an amplifier, so I've just been doing some brainstorming while waiting for them to arrive.

In the long run something like IEEE 1901 will likely be the best solution. I've got some stuff to post on that, plus some more thoughts about half-duplex mode. I'll try to get them both out the door tomorrow unless I get distracted by other things like work.