good intentions and how the internet doesn't forget

This week was a little wild. So we’ll be a little short today. I’ll have to renegotiate the pay by the word agreement I have with Rick in accounting. Fixed price is what I’ll be lobbying for moving forward. It’s just a great model, right???

I keep a document each week where I dump links I found interesting. Usually some theme emerges from that pile of words that I can channel into one of these things I post at random each week.

This week I only have two articles I saved and I’m not sure what it says about me or my opinions on tech this week.

First one is a bit of a downer, but actually a fantastic look at data, privacy and how algorithms model our behaviors online: I Called Off My Wedding. The Internet Will Never Forget.

As technologists we have a certain responsibility to be as up on this stuff as possible, because we never know when something we make will stumble into this territory. I like to assume the Pinterest team built their product with the best intentions, and these issues will still arise as things scale, shift and grow.

Case in point, this pretty damning quote:

“The internet doesn’t know or care whether you actually had a miscarriage, got married, moved out, or bought the sneakers. It takes those sneakers and runs with whatever signals you’ve given it, and good luck catching up.”

The above was in the context of how Pinterest and many other ad networks tune the algorithms that decide what to recommend you each week. As someone who has googled things like engagement rings AND cloud technologies, I can tell you once the internet gets a piece of you it thinks is $$$, it has a hard time letting go of it.

That article makes a strong thesis around the data that drives these experiences: as users we should have access to a chisel, and not require a sledgehammer, to tune them. The sledgehammer in this instance is going HAM and deleting your internet presence entirely, which I consider on the daily but am simply too weak to pull the trigger on.

Yes, deleting all that data would solve the issues raised in the article above, but some of that data is also what's powering extremely effective recommendation engines. And I don't think this is all bad; I think the tech just got out over its skis a bit.

So on the point of extremely effective recommendation engines, we look at the weird result of the tech industry making experiences so compelling / addicting / enthralling that the Pixel product team has dreamed up something to help us snap out of it.

Google’s Pixel has a new feature that’s a brilliant insult to your intelligence. I put the headline first cause I think it’s pretty funny, but the meat here is our pals Google are making it so our phones tell us to look up so we don’t walk into stuff like fountains, poles or traffic.

Or rather the official name: Heads Up.

I'm pretty into the digital wellbeing initiative from Google. I think it's at least TRYING to solve for some of the mess it's had a very dramatic hand in creating. However, I think this line sums up my feelings on the feature as well:

“You’d think we were better than this. But no, we’re not.”

So here we are. We have this crazy network of software creating things so compelling that we can't be bothered to look out for our own well-being. And it's doing it so well, it won't let us let go of our failed marriages, ex-friends or tragic events.

And it keeps coming back to one theme: how do we get back in control of our own data, the data that creates these loops that keep us glued to our devices?

I think the data that drives this messy problem is only one piece of a broader issue. And getting on top of it will take a tremendous amount of coordination from large tech companies, their ad networks and probably regulators too.

There are two other key factors I'd pose that have created this weird future we're hurtling through: the context around our data and the relationship we have with it. In our experience with user testing, and anecdotally at large, we by and large don't know how our data is used, so we look at it like a black box.

That's where I think we can make moves. It's the smaller players (like… US) that are also going to bring the novelty that can result in these shifts in perception. We can make competitor products look kinda evil by presenting better context around recommendations, data capture, etc. - changes that in turn shift users' relationship with their own data, our clients' products, and ultimately their brand or employee perception.

Obviously this is all back-of-the-napkin stuff - I'm sure a lot of these ideas are either impossible or would never fly. And a lot of it we don't have direct control over, by the nature of our agreements. But also, if we don't start asking about this or pushing our clients on it, who will?