
USB3: Why it's a bit harder than USB2

237 points | panic | 5 years ago | lab.ktemkin.com

154 comments

[+] jpm_sd|5 years ago|reply
I recently learned that USB3 is not only badly designed, flaky and unreliable, it is also an EMI/RFI nightmare. This is woefully understated in the article when it says:

"It's hard not to generate harmful interference."

We built a prototype sensor payload that included a USB3 external hard drive. Started suffering broad spectrum interference that stomped all over L-band (Iridium and GPS) reception. Made the entire system unusable!

Some references on USB3 noise:

USB 3.0 Radio Frequency Interference Impact on 2.4 GHz Wireless Devices (by Intel) https://www.usb.org/sites/default/files/327216.pdf

see especially Figure 2-2: "the data spectrum is very broadband, ranging from DC to 5 GHz"

USB 3.0 Interference - Cradlepoint Knowledgebase https://customer.cradlepoint.com/s/article/NCOS-USB-3-0-Inte...

"USB 3.0, or SuperSpeed USB, uses broadband signaling that can interfere with cellular and 2.4GHz WIFI signaling. This interference can significantly degrade cellular and 2.4GHz WIFI performance. Customers using cellular networks or 2.4GHz WIFI networks near USB 3.0 devices should take measures to reduce the impact of these devices on their network connectivity. Please note that interference is generated by both the actual USB 3.0 device as well as its cable."

[+] baybal2|5 years ago|reply
In my own experience, it is very hard to engineer a device that can run WiFi at full speed alongside USB3, especially in something like a smartphone or tablet, where the USB3 pins come straight out of the SoC.

In fact, we tested existing laptops on the market for WiFi/USB3 coexistence, and only one laptop ever passed flawlessly. It was a fairly old Sony Vaio, whose USB3 lanes were physically routed under an RF shield from controller to port.

[+] vlovich123|5 years ago|reply
Is this a uniquely USB3.0 problem? I've been in the consumer electronics space for 10 years, and from the very beginning I would hear reports about problems with 2.4GHz WiFi any time data was being transferred over USB (and usually EEs try to tackle this with shielding if USB + WiFi coexistence is important). I have not heard of this as uniquely new to USB3.

I have never heard of interference with Iridium & GPS for USB2/3, and I worked on a team that was responsible for GPS at Apple, but it's entirely possible there's shielding needed to account for this and I just wasn't closely involved in it. If this were actually an unsolvable problem, though, I would expect CarPlay & Android Auto to have a problem when you use your phone for navigation and start playing audio through USB. Maybe that's not enough traffic, frequently enough, to generate the noise needed to make it unusable; maybe the EE problem you were having was different; or maybe there's just good shielding in phones to avoid this as a problem.

[+] ecopoesis|5 years ago|reply
It's not just USB 3, Thunderbolt 3 also has an interference problem. I had to get a 1 foot extension cable for my Logitech Unifying receiver plugged into a CalDigit Thunderbolt 3 dock to prevent my mouse from freezing. My wife had to do the same with her Unifying receiver plugged into a LG 5k Thunderbolt monitor.
[+] tarruda|5 years ago|reply
For years now I've used a Logitech wireless dongle paired to a mouse and keyboard.

During all those years, the only way I could avoid random drops was by connecting the dongle to USB 2.0 ports.

I've had this issue both on an old Dell L502X laptop and now on my current ASRock X370 motherboard, which doesn't even have native USB 2.0 ports.

It's sad to learn this is actually a design issue, and thus may recur with any computer I acquire in the future.

[+] charwalker|5 years ago|reply
I knew it! I had a very comfortably shaped 'gaming' mouse, but it used a dedicated 2.4GHz adapter. Any time I did a large file transfer over USB 3.0 (any flash drive reproduced the issue), the mouse would become jumpy and unusable, and WiFi speeds dropped to maybe 10% for the duration of the transfer. I ended up giving the mouse to a friend and going back to my first-gen MX Master.
[+] rehevkor5|5 years ago|reply
Also, if the mentioned "key" used on each end is well known, or the XOR approach is not cryptographic in quality, then anyone could listen to this EM output and compromise your "hard wired" data, which you'd (forgivably) expect to be secure, a la TEMPEST. Depending on the strength of the signal, this could be a serious problem.
[+] seiferteric|5 years ago|reply
Way back when, I thought I remembered that USB3 or Thunderbolt was going to be an optical interface, but I guess that was too tough in practice. It would have been nice, though, to avoid the EM interference issues.
[+] rimliu|5 years ago|reply
Experienced this first hand. I had a crappy WiFi connection at home, started to investigate, and found that signal quality degraded a lot when my Seagate SSD was connected over USB3.
[+] robotnikman|5 years ago|reply
I had to wrap a USB 3.0 cable in tin foil once when I realized it was causing the issues I was having with WiFi.
[+] usr1106|5 years ago|reply
We are using Intel Realsense cameras in our product. Sometimes the USB connection fails completely, sometimes it comes up as USB2 only and the camera cannot work. Sometimes a reboot helps, but sometimes power cycling is required. This is without replugging the cable; replugging adds its own unreliability. From reading the product support forums, we are not alone.

As a SW engineer my interpretation has always been that USB3 speeds are just too high to work really reliably with consumer grade hardware. This article gives technical details that tell me I have been correct.

[+] colechristensen|5 years ago|reply
> As a SW engineer my interpretation has always been that USB3 speeds are just too high to work really reliably with consumer grade hardware. This article gives technical details that tell me I have been correct.

The last section of the article is the most relevant: the hardware, tools, and documentation for USB3 are of mediocre quality. The whole history of moving bits over cables is one of adding and adapting tricks to get more and more data through. We've always been at the point where poorly implemented solutions cause problems, and we're several orders of magnitude away from a novice being able to implement a hardware and software solution from scratch (bit-banging a serial interface on a microcontroller GPIO is something a reasonable person can do in a week, for a sub-megabit connection). Super-fast data rates are everywhere, and most of them are very reliable; USB just isn't up to the same quality. You can make any speed reliable and foolproof, you just have to do the work.

[+] ladberg|5 years ago|reply
I've had nearly the same experience! I was working on a project for a robotics class and my team could not figure out why our object tracking went from 30fps to ~5fps seemingly randomly. It turns out the USB3 cable wasn't fully plugged in and couldn't make full contact so it was running on USB2 and didn't have the bandwidth to send the video at 30fps.

It took a while to debug because I was absolutely convinced the slowdown was a software or GPU issue. After all, wouldn't you expect most digital ports to be binary: either plugged in or not? It turns out that with USB3 there's a third state.

[+] MrBuddyCasino|5 years ago|reply
Apple/Intel Thunderbolt does not have this problem in my experience. USB has always been the cheapest alternative (remember FireWire?). Add to that the sheer incompetence of the USB committee (see the USB PD spec, or USB Audio), and it's a miracle anything works at all.
[+] raxxorrax|5 years ago|reply
This has also been a problem with USB2 cameras. The signal rates reach physical limits. A 5m cable is probably too long for high-speed applications; most camera vendors say that they support only 3m. But if you have a camera in any rugged device, even a bit of EMI (electromagnetic interference) can ruin your connection.
[+] tinus_hn|5 years ago|reply
If it starts working again after a reboot it’s a problem that can be solved in software. It’s just hard so consumers instead have been trained to accept unreliability.
[+] echelon|5 years ago|reply
> We are using Intel Realsense cameras in our product.

That's an interesting domain. What are you working on? Any reason for choosing Realsense over Azure Kinect or other sensors?

[+] rubicks|5 years ago|reply
I can always rely on our Intel Realsense cameras -- to magically transform themselves into USB 2.1 devices after a couple hours' use.
[+] swiley|5 years ago|reply
IMO: It's not the speed that's the problem; lots of consumer hardware has very fast networks/buses, like Ethernet. It's the huge number of states and the communication required to negotiate them.
[+] bleepblorp|5 years ago|reply
I wish the industry would bite the bullet and come up with 'USB-hypothetical' based around a strictly enforced standard which used active optical cables with additional conductors for power.

An awful lot of USB issues would go away by getting rid of the need to push an electrical protocol too close to physical limits; by enforcing sane minimum standards so end users have a clear idea of what a given port can provide (none of the current mess where 'USB-C' denotes a connector that could carry any number of protocols); and by making layer 1 a dumb pipe (no special wires needed to support optional feature X) so functionality could be entirely software-defined.

USB is just too complicated for its own good, and has too many optional features, to be friendly to end users.

[+] baybal2|5 years ago|reply
USB certification already exists, and it consistently fails regardless of how hard and arduous it is to pass. The only option that seems likely to make it actually work is to make it even stricter, which itself would be a turnoff to hardware makers and, more importantly, chipmakers.

Thunderbolt chips already cost an arm and a leg.

[+] xorfish|5 years ago|reply
Optical cables are not robust enough for use as everyday cables.
[+] ben509|5 years ago|reply
> To reduce the amount of harmful interference generated, USB3 links use a technique called scrambling, in which data is XOR'd with a fixed pattern before transmission.

So I can take that pattern, repeat it a few million times, and then if someone transmits it (as simple as saving it to disk), it turns their USB devices into transmitters.
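To make the XOR-scrambling idea concrete, here's a minimal Python sketch of an LFSR scrambler of the kind USB3 and PCIe gen1/2 use. The polynomial G(X) = X^16 + X^5 + X^4 + X^3 + 1 and seed 0xFFFF match the spec, but the bit ordering and reset-on-symbol rules are simplified here, so this is illustrative, not bit-exact:

```python
def lfsr_stream(seed=0xFFFF):
    """Yield one keystream byte per step from a 16-bit Fibonacci LFSR.

    Taps follow x^16 + x^5 + x^4 + x^3 + 1; bit ordering and reset
    behavior are simplified relative to the actual USB3 spec.
    """
    state = seed
    while True:
        out = 0
        for _ in range(8):
            bit = state >> 15                                  # output = MSB
            fb = (bit ^ (state >> 4) ^ (state >> 3) ^ (state >> 2)) & 1
            state = ((state << 1) | fb) & 0xFFFF
            out = (out << 1) | bit
        yield out

def scramble(data, seed=0xFFFF):
    # XOR with the keystream; the same call descrambles.
    return bytes(b ^ s for b, s in zip(data, lfsr_stream(seed)))

payload = b"hello usb3"
assert scramble(scramble(payload)) == payload   # XOR is its own inverse

# The attack described above: transmit the keystream itself, and the
# scrambled wire signal becomes all zeros (a worst-case constant pattern).
evil = scramble(bytes(len(payload)))
assert all(v == 0 for v in scramble(evil))
```

In a real link the scrambler is periodically reset on known symbols and packets are bounded in size, which limits how long an attacker-supplied stream can stay aligned with the keystream.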

[+] jleahy|5 years ago|reply
Correct, and these attacks exist and will cause a link to go down. It’s a problem that was considered in the design of both 10G ethernet and PCIe which both use similar mechanisms.

Generally, a key defence is the maximum packet size (you can't keep your evil stream aligned with the scrambler), along with the huge pattern length (for PCIe gen1/2 and USB3; 10GE and PCIe gen3 work differently), which means that the amount of data you'd have to send is just too large.

[+] willis936|5 years ago|reply
Only if certain other conditions are met.

The PCB routing and cable must not be tightly coupled differential pairs (loosely coupled pairs are almost never seen in practice).

The data you're trying to transmit must fall within the line code. Keep in mind only 1/4 of all possible bit sequences are even available in 8b/10b.
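A rough counting sketch of that 1/4 figure, assuming plain 8b/10b data symbols (control symbols and run-length rules ignored):

```python
from math import comb

# 8b/10b maps each 8-bit byte into a 10-bit symbol, so at most 2**8 data
# code groups (plus a few control symbols) out of 2**10 possible words:
print(2**8 / 2**10)                      # 0.25, the "1/4" above

# DC balance constrains legal symbols further: valid code groups carry
# four, five, or six ones (running disparity of -2, 0, or +2).
balanced = sum(comb(10, k) for k in (4, 5, 6))
print(balanced, balanced / 2**10)        # 672 words, about 66% of the space
```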

[+] m463|5 years ago|reply
sounds sort of like Gray codes in a different domain.
[+] fogihujy|5 years ago|reply
I keep ending up with USB3 equipment connected through a USB2 port because it won't work in USB3 ports. It's not often, but common enough to be annoying.

An interesting question is whether the USB2/micro-USB connector era was peak USB, and whether we're going to see more varied and divergent implementations in the future.

[+] wjdp|5 years ago|reply
Getting an Oculus CV1 VR headset working via my mobo's USB was fraught with issues (four devices trying to run at USB3 speeds). Eventually I gave up and bought a known-good PCI card.

Though I think this was more to do with controller bandwidth than the physical layer.

[+] yetihehe|5 years ago|reply
I connected a popular Blue Pill board to USB3 on my computer. The result: the device works, but is damaged in such a way that it draws an additional 200mA (even without a USB connection) and turns it into heat. Fixing it requires soldering on a new chip.
[+] rkagerer|5 years ago|reply
I gave up trying to get my HTC Vive Pro camera to work over USB3.
[+] etaioinshrdlu|5 years ago|reply
A lot of these issues remind me of twisted-pair Ethernet. It's interesting that Ethernet is usually coupled with transformers, but USB3 is typically done with capacitors. Can someone more knowledgeable comment on the relative difficulties and problem points in 10GbE vs. USB3?
[+] kbumsik|5 years ago|reply
10GBASE-T is a different story from USB3.

10GBASE-T consumes too much power (2-5 watts!) and generates too much heat per port (a transceiver can reach 90°C!), so it is not suitable for most areas, including home NICs (which need space for big coolers) and enterprise switches (ports cannot be densely placed [1]). But I haven't heard of the reliability concerns that USB3 has. Well, it is designed to work up to 100 meters, so it should be reliable. Maybe it's all because of the transformers you mentioned?

Therefore it's very common to use alternatives like optics and simple DAC copper cables for 10GbE. I also think the home networking industry will eventually give up on 10GBASE-T.

[1] https://wiki.mikrotik.com/wiki/S%2BRJ10_general_guidance#Gen...

[+] formerly_proven|5 years ago|reply
10GBASE-T uses four pairs full-duplex for 10 GBit/s total BW per direction. USB 3.0 uses one pair per direction for 5 GBit/s. The analog bandwidth of USB 3.0 extends to 3+ GHz. 10GBASE-T only goes up to ~400 MHz or so.

Running at much lower frequencies makes it reach much farther, makes it more immune to EMI, and makes it more reliable (because the margins are much fatter when your run is only 10% of the maximum possible cable length).
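Back-of-the-envelope, the fundamentals work out roughly like this, assuming NRZ signaling for USB 3.0 and the commonly cited 800 Msymbol/s per pair for 10GBASE-T:

```python
# USB 3.0: 5 Gb/s NRZ on a single pair; Nyquist fundamental = bit rate / 2.
usb3_fundamental_ghz = 5.0 / 2
print(usb3_fundamental_ghz)        # 2.5 (GHz; harmonics extend past 3 GHz)

# 10GBASE-T: 10 Gb/s spread over four pairs using multi-level (PAM)
# signaling at roughly 800 Msymbol/s per pair, so per-pair analog
# bandwidth stays low.
tengbase_t_fundamental_mhz = 800 / 2
print(tengbase_t_fundamental_mhz)  # 400.0 (MHz, matching the figure above)
```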

[+] RL_Quine|5 years ago|reply
Capacitors are cheaper and smaller, and you don't have runs of USB long enough to really need proper galvanic isolation.
[+] MisterTea|5 years ago|reply
USB 3 and the idiotic USB C standard can die in a fire.

Replace it with a stripped-down version of Ethernet with a simple, sane protocol that can encapsulate Ethernet frames, DisplayPort frames, and PCIe frames. Something like FireWire, where we can memory-map a device or perform bulk transfers or byte-oriented transfers.

Maybe a "converged" controller with some brains could handle Ethernet frames directly, so an Ethernet port or dongle isn't much more than an external MAC. Then we could use existing Ethernet interface hardware: MACs, jacks, and so on. A new connector would be nice too, one that doesn't have the ability to inject 20V into a 5V device. Real-time traffic can be handled using TSN, and PTP can be used for timing.

Finally, combine that with single-pair Ethernet for local desktop device networking at up to 1 Gb/s with PoE over a single pair of wires. Why do we need USB 3 and C again?

[+] labawi|5 years ago|reply
> Why do we need USB 3 and C again?

Typical copper Ethernet is 1 Gb/s, while USB 3 is 5 Gb/s, going on 10/20/40 Gb/s with USB 3.2, USB 4 .. Thunderbolt.

As others have said, 10G Ethernet is too power hungry, and even 1G Ethernet is inherently more expensive. I resent active cables and do prefer Ethernet, but it's not a realistic proposition for most use cases.

[+] baybal2|5 years ago|reply
Not just a "bit" harder, but much, much harder.

I constantly see "certified" USB3 gear failing.

[+] neltnerb|5 years ago|reply
Keep needing to explain this one to clients... sure, I think I can get it to work... it'll be a lot trickier.

HDMI though, the routing examples for that one are stunning.

[+] pfundstein|5 years ago|reply
Doesn't HDMI also have dedicated pins for Ethernet?
[+] kbumsik|5 years ago|reply
Will USB4/Thunderbolt change the story? I heard their signaling is quite different from USB3's.
[+] baybal2|5 years ago|reply
Yes it will, by making it even more complicated.

Signal integrity minimums for Thunderbolt are stricter than for USB3, and it is a bigger EMI problem, but at least the new standard mandates shielding.

[+] gxx|5 years ago|reply
Do these problems also apply to USB C?
[+] droopyEyelids|5 years ago|reply
USB C is the shape of the connector; USB 3 is the protocol.

So yes, you can have USB 3 over USB-C or over older USB connector shapes.

[+] villgax|5 years ago|reply
What's harder is getting a mobile phone to work as a damn USB webcam without paying money for it.
[+] brian_herman|5 years ago|reply
I bet USB 4 is like calculus compared to these.