The uncertainty principle is at the foundation of quantum mechanics: You can measure a particle's position or its velocity, but not both. Now it seems that quantum computer memory could let us violate this rule.
The theoretical underpinnings of the uncertainty principle are, like most things to do with quantum mechanics, extremely difficult to follow and require a minimum of six degrees to really understand, but the great physicist Paul Dirac provided a more concrete illustration of what the uncertainty principle means. He explained that one of the very, very few ways to measure a particle's position is to hit it with a photon and then chart where the photon lands on a detector. That gives you the particle's position, yes, but the collision has also fundamentally changed its velocity, and the only way to learn that new velocity would, in turn, alter its position.
Now, technically speaking, the uncertainty principle doesn't forbid you from measuring both the position and the velocity of a subatomic particle - it merely prevents you from measuring both with any great precision. It's possible to get a rough idea of both or a highly accurate measure of one, but those are your only options. So you could weaken the photon burst so that the particle's velocity was less affected, but this would give you a fuzzier sense of its position while still changing its velocity, if to a lesser degree than if you had set out to measure its position exactly.
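For readers who want the tradeoff in symbols, Heisenberg's standard formulation (a long-established fact about the theory, not part of the new work) says the uncertainties in position and momentum - momentum being mass times velocity - can never both be squeezed below a fixed limit:

```latex
\Delta x \,\Delta p \;\geq\; \frac{\hbar}{2}
```

Here $\Delta x$ and $\Delta p$ are the spreads in position and momentum, and $\hbar$ is the reduced Planck constant. Shrink one spread and the other necessarily inflates.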
That's more or less been the status quo of quantum mechanics since Werner Heisenberg first published his theories in 1927, and no attempt to overturn it - including several by Albert Einstein himself - has proved successful. But now five physicists from Germany, Switzerland, and Canada hope to succeed where the father of relativity failed. If they do, it will be because of something that wasn't even theorized until decades after Einstein's death: quantum computers.
Key to quantum computers are qubits, the individual units of quantum memory. A particle would need to be entangled with a quantum memory large enough to hold all its possible states and degrees of freedom. Then the particle would be separated from the memory and one of its features measured. If, say, its position were measured, the researcher would tell the keeper of the quantum memory to measure its velocity.
Because the uncertainty principle wouldn't extend from the particle to the memory, it wouldn't prevent the keeper from measuring this second figure, allowing for exact (or possibly, for obscure mathematical reasons, almost exact) measurements of both figures in flagrant disregard of Heisenberg's principle. Even if this didn't destroy uncertainty completely, at the very least it would fundamentally alter our understanding of quantum mechanics and particle physics. (It might even reopen the possibility of that interstellar ansible, but you didn't hear that from me.)
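The entanglement trick can be shown in miniature. What follows is a toy sketch, not the researchers' actual construction: one "particle" qubit entangled with one "memory" qubit in a Bell state, simulated with NumPy, with the standard Z and X measurement bases standing in for position and velocity. Whichever of the two the particle's owner measures, the memory's keeper - told only which basis was used - gets a perfectly correlated answer:

```python
import numpy as np

# Maximally entangled Bell state |Phi+> = (|00> + |11>)/sqrt(2).
# Qubit A plays the particle, qubit B the quantum memory.
phi_plus = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)

# Two complementary measurement bases, stand-ins for "position" and "velocity":
# Z (the computational basis) and X (the Hadamard-rotated basis).
Z_BASIS = np.eye(2)
X_BASIS = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)

def joint_probs(state, basis_a, basis_b):
    """Born-rule probabilities of each outcome pair (00, 01, 10, 11)
    when A measures in basis_a and B measures in basis_b."""
    # Rotate each side so its chosen basis becomes the computational basis,
    # then the squared amplitudes are the outcome probabilities.
    rotated = np.kron(basis_a.conj().T, basis_b.conj().T) @ state
    return np.abs(rotated) ** 2

# Same-basis measurements on |Phi+> are perfectly correlated: only the
# matching outcomes 00 and 11 ever occur, each with probability 1/2.
print(joint_probs(phi_plus, Z_BASIS, Z_BASIS))
print(joint_probs(phi_plus, X_BASIS, X_BASIS))
```

Both printed distributions put all their weight on the matching outcomes: the memory can "answer for" either observable on demand, which is exactly the loophole the scheme exploits. Scaling this from one illustrative qubit pair to a memory holding a real particle's full state is where the experimental difficulty lives.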
The mathematics of all this appears to be sound, but we're still a long way from testing it in the laboratory. It would take lots of qubits - far more than the dozen or so we've so far been able to entangle at any one time - to store all of a particle's quantum information, and keeping that many qubits entangled together would be extremely fragile and tricky. Not impossibly tricky, mind you, but still way beyond what we can do now. Quantum computers had better be ready the day they come online, because we've got one hell of a to-do list waiting for them.