How do we analyze a black box? Well, first we have to know what comes in and what comes out.
What comes in, in the case of ridesharing companies?
- Full ride history, including pickup/dropoff addresses, stops and routes taken.
- Timings. These are important. For example, someone getting picked up at the bar 40 minutes after it closed is probably not an alcoholic but one of the vendors.
- Motion sensor data. Officially it’s used to detect harsh braking and acceleration, but it’s obviously stored for further analysis.
- Address book contents, if you ever used a built-in “send an invite” feature.
- Social media profile(s) – again, if you ever used a built-in sharing feature.
- Relayed text and call history and contents.
- On-app tips, both given and received.
- History of complaints, both filed and received.
- Star ratings, both given and received.
- Camera access. Call me a conspiracy theorist, but when I deny the Uber app camera access in settings, it wants a fresh mugshot of me for “verification” within a few hours, while if I leave camera permissions untouched, it goes for months without needing said “verification”.
To sum it up, the more you use Uber or Lyft, whether as a driver or a rider, the more it knows about you. If you use ridesharing platforms often, they probably know you better than your mother does. Everything from speech patterns, cuisine choices and shopping habits to extramarital affairs – it’s all in there.
So, what’s in the box?
When Uber and Lyft argued in courts that they are not transportation but IT companies, they were actually correct. They are the “big data” companies, collecting humongous amounts of data on their users. It’s probably worse than Facebook.
From the driver’s perspective, I believe they keep a variety of numeric scores for every factor they deem important to their “smart matching” algorithms – on both drivers and riders. Factors ranging from the likelihood of you filing (or incurring) a cleaning fee to your sexual preferences. Remember that nasty chime the app makes when you exceed 80 mph? That ticks up your “likelihood of getting a speeding ticket” risk score.
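To make the idea concrete, here’s a purely hypothetical sketch of that kind of per-user risk ledger. None of the factor names or weights come from Uber or Lyft – they’re my own guesses at how such a thing could look:

```python
# Hypothetical per-user risk ledger -- factor names and weights are
# illustrative guesses, not anything the platforms have published.
from collections import defaultdict

class RiskProfile:
    def __init__(self):
        # one running numeric score per behavioral factor
        self.scores = defaultdict(float)

    def record_event(self, factor, weight=1.0):
        # each observed event nudges the corresponding score upward
        self.scores[factor] += weight

    def score(self, factor):
        return self.scores[factor]

profile = RiskProfile()
# the 80 mph chime: every over-speed reading ticks the speeding risk score
for speed_mph in [72, 83, 85, 79, 91]:
    if speed_mph > 80:
        profile.record_event("speeding_ticket_risk")

print(profile.score("speeding_ticket_risk"))  # 3.0
```

The point isn’t the code itself – it’s that once every app event maps to a numbered factor, “smart matching” is just arithmetic over those scores.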
Most importantly, if Uber or Lyft don’t know something about you, they’ll match you with a few “challenging” riders or drivers to determine your score.
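This “probe when unknown” behavior is basically the explore/exploit pattern from recommender systems. A hypothetical illustration (all names are mine, assumed for the sketch):

```python
# Hypothetical explore/exploit matching sketch: while the platform has too
# few observations for some factor, deliberately schedule a "challenging"
# match to collect data; once there's enough data, match normally.
def needs_probing(observation_count, min_observations=5):
    # explore while data on this factor is still sparse
    return observation_count < min_observations

def pick_match(driver_history, factor):
    count = len(driver_history.get(factor, []))
    if needs_probing(count):
        return "challenging_rider"   # probe to learn the score
    return "regular_rider"           # enough data, exploit the known score

history = {"mess_tolerance": [0, 1]}           # only 2 observations so far
print(pick_match(history, "mess_tolerance"))   # challenging_rider
```

If something like this runs in production, it would also explain the clustered weirdness below: probes come in batches, not spread evenly.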
A few observations from my experience:
- All of my Pool rides that were actually shared with other pax were gender-matched: it was always male + female. The few notable exceptions were all gay.
- No messy pax for almost a year, then all of a sudden – 4 messes in 2 days… In the daytime. On weekdays. Really?!
- Once every few months I get a streak of obnoxious assholes that I have to kick out of my car. Somehow it *always* happens in the same week.
- Don’t get me started on tipping patterns. Comparing Uber and Lyft side by side, it’s obvious these patterns are artificial AF.
TL;DR: both Uber and Lyft collect ungodly amounts of data on their users and like playing god with it.
PS: A practical implication for drivers – if your market seems dead today, change your own behavioral pattern to restart the “testing” phase of the AI.