“But I Can’t Do It Without You!!” ~ us to AI, probably

So my girlfriend was getting on me the other day for using Google Maps (a lifesaver, really) to get back home, when I have literally driven the streets I was about to drive in Atlanta more times than I can count.

Her?? Questioning my navigating abilities??

Yeah ooook.

So I took on the challenge and turned the GPS off (it was like maybe a 10-minute drive with fewer than 7 turns), but I'm ashamed to admit that for a brief moment when I put the car in drive…

I felt lost. Naked even.

In that moment, I came to the realization that as many times as I’ve driven in Atl, I have ALWAYS had my GPS on to get places.

But hear me out! My justification is that Atl traffic is gonna be Atl traffic, and at any moment a huge accident could happen between me and the spot I'm trying to get to. So I don't wanna get caught up by a road shutdown and have my 15-minute drive (because everything in Atl is a 15-minute drive away) turn into a 40-minute drive.

In reality though, that’s not true. If I really wanted to, I could just check my maps before I go somewhere to make sure that I-285 didn’t catastrophically blow up, then head to wherever I’m going.

But using GPS feels efficient, and it lets me not really think or pay too much attention to my navigation when I drive. Apple Maps tells me where to go, and I follow. I'm not present, nor do I actually learn about my environment, the streets, and all the possible ways to get to the same spot.

But what other costs does GPS convenience come with?

Shamefully, I think that if I were more than 20 minutes of driving away from home and I didn't have access to GPS, it would take me some concerningly serious time to figure out how to get back. (It might not be too bueno for me if my phone dies and I'm up in North GA somewhere.)

So why do I mention my atrocious navigation abilities? Because I think it relates to how we use AI for our thinking and writing.

AI is everywhere.

Businesses are using AI, I'm learning how to use AI (which this post was not written with, btw), you're probably learning how to use AI, and kids are growing up with AI.

(I legit watched a kid shout out ChatGPT for helping her get through high school in her speech at my brother's graduation ceremony; it was comedy to see the teachers' reactions.)

Word on the street is that there are people out here using ChatGPT like a magic conch (read: genie lantern). Like the wake up in the morning and ask GPT, "should I drink water?", type of use.

Ok, maybe not that bad, but people are using it for simple, common-sense choices that a person could (and should) make for themselves with a shred of internal reasoning.

People are also using it to generate essays, write emails, and send all kinds of messages out to people without putting any thought or effort into the communication that gets generated. Which makes me curious about how our relationship with AI will develop in the future.

Don’t get me wrong, AI is an extremely useful tool (I use it almost every day to help me get stuff done), but what happens if people completely outsource their writing and their deductive reasoning to a Large Language Model (LLM)?

Reasoning and decision-making become a muscle that atrophies, a skill that we'll no longer use.

Then what happens when multiple generations that grew up with outsourced decision-making momentarily (or permanently) no longer have access to the agent that helps them with their choices?

Will we flail around like fish out of the sea, dying to be thrown back in? idk

It will be interesting to see how my generation (Gen Z) and the ones to come will grow with AI as a part of everyday life.

Hopefully as their collaborator, and not as their end-all be-all.

