That's probably not a surprise to you, but a huge piece from The New York Times this week illustrates just how audacious Uber really is when it comes to bending the rules:
Uber engineers assigned a persistent identity to iPhones with a small piece of code, a practice called “fingerprinting.” Uber could then identify an iPhone and prevent itself from being fooled even after the device was erased of its contents.
There was one problem: Fingerprinting iPhones broke Apple’s rules. Mr. Cook believed that wiping an iPhone should ensure that no trace of the owner’s identity remained on the device.
So Mr. Kalanick told his engineers to “geofence” Apple’s headquarters in Cupertino, Calif., a way to digitally identify people reviewing Uber’s software in a specific location. Uber would then obfuscate its code for people within that geofenced area, essentially drawing a digital lasso around those it wanted to keep in the dark. Apple employees at its headquarters were unable to see Uber’s fingerprinting.
The ruse did not last. Apple engineers outside of Cupertino caught on to Uber’s methods, prompting Mr. Cook to call Mr. Kalanick to his office.
Apple banned "device fingerprinting," a method of tracking an iPhone or iPad even across deletions of an app, in iOS 9, and had cracked down on similar tracking even earlier with UDIDs.
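The geofencing trick described above boils down to a location check gating the rule-breaking code path. Here's a minimal sketch in Python; the coordinates, the 10 km radius, and all function names are my illustrative assumptions, not Uber's actual values or code:

```python
import math

# Assumed coordinates for Apple's Cupertino headquarters (illustrative).
APPLE_HQ = (37.3318, -122.0312)
GEOFENCE_RADIUS_KM = 10.0

def haversine_km(a, b):
    """Great-circle distance in kilometers between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    h = (math.sin(dlat / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(h))

def inside_geofence(location):
    """True if a device's (lat, lon) falls within the fenced radius."""
    return haversine_km(location, APPLE_HQ) <= GEOFENCE_RADIUS_KM

def should_hide_fingerprinting(location):
    # The reported ruse: suppress (or obfuscate) the offending code path
    # for anyone inside the fence, i.e. a likely Apple reviewer.
    return inside_geofence(location)
```

A device at the Cupertino campus would trip the check, while one in San Francisco (roughly 60 km away) would not, which is exactly why reviewers outside Cupertino eventually caught it.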
Uber, like many others, was purposefully breaking Apple's rules by doing this, and going to extreme lengths to hide it from Apple in the review process. What's curious, however, is that Apple gave Uber only a dressing-down rather than kicking it out of the App Store, a move that would have decimated Uber overnight.
The company claims the practice was to "prevent fraud" but given the other things it's been caught red-handed doing in the past, it's hard to believe that's the only motive behind anything it does.
The New York Times highlighted one other creepy tidbit worth noting:
Uber devoted teams to so-called competitive intelligence, purchasing data from an analytics service called Slice Intelligence. Using an email digest service it owns named Unroll.me, Slice collected its customers’ emailed Lyft receipts from their inboxes and sold the anonymized data to Uber. Uber used the data as a proxy for the health of Lyft’s business.
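The "proxy" idea in that excerpt is simple aggregation: strip the identities, keep the counts and dollar amounts. A rough sketch of what that could look like, with entirely made-up records and function names (this is not Slice's actual pipeline):

```python
from collections import Counter
from datetime import date

# Hypothetical anonymized receipt records: (day, fare), no user identity.
receipts = [
    (date(2017, 4, 1), 12.50),
    (date(2017, 4, 1), 8.75),
    (date(2017, 4, 2), 22.00),
]

def health_proxy(records):
    """Aggregate anonymized receipts into per-day (ride count, gross fares),
    a crude stand-in for a competitor's business volume."""
    rides, fares = Counter(), Counter()
    for day, fare in records:
        rides[day] += 1
        fares[day] += fare
    return {day: (rides[day], round(fares[day], 2)) for day in rides}
```

No single record identifies anyone, yet the aggregate trend (rides per day, revenue per day) is exactly the competitive signal a buyer would pay for.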
Ah, Unroll.me, the innocent, convenient service for cleaning up your inbox — I wonder how they make their money? Now you know.
Ironically enough, a lot of people started deleting the service when the news broke (who wouldn't, it's creepy!), so the CEO wrote a blog post detailing just how sad it was for him to find out people were upset to be data mined:
It was heartbreaking to see that some of our users were upset to learn about how we monetize our free service. And while we try our best to be open about our business model, recent customer feedback tells me we weren’t explicit enough.
Give me a break: you knew it was deceptive, but your company hoped nobody would find out the extent of it. In technology, most of us think of data mining as a necessary evil and don't think twice about the users, because it's "normal" to us.
It's easy to forget that, for most of us in the industry, the trade-off is fairly obvious: when a service is free, we're the product, and we make a small mental calculation about whether it's worth it. But make no mistake, everything you use and don't pay for is doing something like this, and it's probably selling that same data to companies that make you feel even more gross than Uber.
Your average user has no idea what business model they're subscribing to, just that the service is somehow there and probably will be forever. They don't think for more than a second about how it's funded: by scraping every single email they've ever sent.
One more thing: think services selling your email give a shit about where your data goes? If this Hacker News comment is anything to go by, it's doubtful you have much privacy left once you connect to one of these services:
At the time, which was over three years ago, they had kept a copy of every single email of yours that you sent or received while a part of their service. Those emails were kept in a series of poorly secured S3 buckets.