Apple's 3D Touch has enabled a new dimension in touchscreen interaction by detecting pressure levels, but Microsoft is betting on detecting gestures even if you don't touch the screen.
True, Samsung and Sony have already launched devices that can detect when you hover your finger over the screen (Air View and Floating Touch), but Microsoft's pre-touch touchscreen seems to have far greater accuracy, allowing the device to sense how far your fingers are from the screen.
This isn't exactly a replacement for pressure detection but more of a complement, and it ends up being better suited to a wide range of interface enhancements that aren't possible with pressure sensing. For instance, interface buttons might appear just as your fingers approach the screen, requiring no actual touch to pop up as they do today; not to mention a wider range of multitouch gestures.
The technology seems to be ready; now all we need are actual devices that use it. Maybe next year's Surface phones?