AI is sorta the new EQ

I sign up for a lot of webinars but rarely listen to them in real time. It depends on the subject matter and the presenter. A webinar a few weeks ago was an exception. The presenters have a track record of engaging, informative webinars, and their observations match what I’ve seen. This one was 2026 B2B Content and Demand Generation Predictions, featuring Robert Rose from the Content Marketing Institute and Jane Qin Medeiros (a brand leader at Informa TechTarget), and hosted by Lauren Smith from Informa TechTarget. They talked a lot about how AI is supposed to make marketing teams work smarter while helping salespeople sell better.

Robert (who closely resembles Butch Vig, the legendary producer of Nirvana and Smashing Pumpkins, and co-founder of Garbage) said something about AI not being able to make bad copy sound better. This instantly reminded me of a common misconception about music. In the late 70s, a technique known as graphic equalization (EQ) crossed over from the telecom industry into musical equipment. And like my previous blog post about vibe coding, you’ve probably used EQ without realizing it. If you’ve ever messed around with the tone controls of a car stereo to make a song sound brassier or brighter, you’ve essentially used an equalizer. A graphic EQ splits the signal spectrum into bands, giving you independent control over the low, middle, and high frequencies. Manufacturers of mixing consoles, effects pedals, guitar amplifiers, and PA systems soon incorporated graphic EQ into their products, and it became essential equipment for serious musicians and recording studios. Every guitarist’s effects rack (mine included) just had to have one. I used an equalizer in my rig to give my solos a little extra oomph, that little touch of something to boost above the rest of the band. Others used one to boost the lower or higher frequencies of their sound. So many musicians used them that it became rare to find a record that wasn’t made with an equalizer somewhere in the chain.

Then the rumors started. The prevailing rumor was that EQ could take a flat note and sharpen it. Not true at all. What you got was merely a flat bum note, embellished with a little brightness. EQ turned out not to be the savior of talent-challenged singers.
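For the technically curious, the myth is easy to debunk in code. Here’s a minimal sketch of a crude frequency-domain graphic EQ in Python with NumPy (my own illustration, not how any particular hardware unit works): boosting the band around a flat note makes it louder and brighter, but its pitch doesn’t move an inch.

```python
import numpy as np

def graphic_eq(signal, sample_rate, band_gains):
    """Crude graphic EQ: apply a gain to each frequency band.
    band_gains is a list of (low_hz, high_hz, gain) tuples."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    for low, high, gain in band_gains:
        spectrum[(freqs >= low) & (freqs < high)] *= gain
    return np.fft.irfft(spectrum, n=len(signal))

def dominant_hz(signal, sample_rate):
    """Frequency of the strongest spectral peak (the perceived pitch here)."""
    spectrum = np.abs(np.fft.rfft(signal))
    return np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)[spectrum.argmax()]

rate = 44100
t = np.arange(rate) / rate                 # one second of audio
flat_note = np.sin(2 * np.pi * 430 * t)    # a "flat A": ~40 cents below A440

# "Brighten" it by boosting the band the note sits in
brighter = graphic_eq(flat_note, rate, [(300, 1000, 2.0)])

print(dominant_hz(flat_note, rate), dominant_hz(brighter, rate))  # → 430.0 430.0
```

The boosted note comes back twice as loud, but the dominant frequency is still 430 Hz: a flat bum note, just embellished with a little brightness.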

So this is where AI wanders into the equation. Can AI make bad copy sound better? I don’t believe it can. Sometimes it just clutters the copy and confuses the writer. When guided properly, AI can help get you unstuck. It may help you reorganize content: it can break up large chapters or paragraphs, for example, or improve the flow. But can it persuade? Not really. Can it understand the audience? Only if you feed it specifics. Can it make you a better copywriter? Dangerous question.

Remember, humans in the loop is the mantra. AI can be a great creative aid, but it works best as a polishing tool rather than a replacement for human strategic thought.

I was Vibe Coding and didn’t know it!

Last week I attended an informal networking session via Zoom that soon veered into Vibe Coding. Confession time – I was unfamiliar with the term and had to consult my friendly neighborhood search engine. It turns out I was already a Vibe Coding rock star without knowing that’s what I was doing.

Vibe Coding is a fairly new technique (first documented a year or so ago) in which developers use natural-language prompts to have AI agents (think Claude or ChatGPT) write, debug, and iterate on code. According to its early adopters, it emphasizes rapid prototyping and “feeling” the code’s direction over manual syntax writing. In other words, you’re leveraging natural language to direct AI as it generates and refines code, with a human steering the whole time.

My use case involved developing landing pages. Salesforce Account Engagement (formerly known as “Pardot”) is our marketing automation platform of choice. Although it produces nice emails and has some wonderful list-making features, the landing pages it produces are plain vanilla at best. Frustrated, I booked a session with Salesforce’s success team to discuss options for landing pages. Their answers ranged from denials to stifled yawns. Yes, they know their tool produces terrible landing pages, but enhancing the landing page capability is nowhere near the product roadmap.

So I went on an expedition to find landing pages I liked and grab their source code. With more than 30 years of HTML programming (YIKES!) under my belt, I thought modifying the source would be a breeze. Some of the pages were fairly easy to modify, but the challenge was getting the code to play nice with both Salesforce and the company’s website platform. The website is built on Squarespace. Why a company that provides consulting and claims processing services to health plans would use a website platform designed for consumer e-commerce is a mystery. Save that rant for another day. Still struggling with the source code, I fed it into ChatGPT and asked how it could be modified to suit. There was a lot of back and forth, but ultimately – after a few hours of restating, reframing, and revising – it advised me on the code changes that needed to be made.

The back and forth with AI is important to note. AI is only as good as the human using it. If you don’t know how to question AI’s output and course-correct, you are in for a huge problem. We’ve all seen AI slop. Vibe Coding is just the latest wrinkle in the story of how AI can make us smarter and better at our jobs. But you can’t just let it go off on its own. Humans have to be in the loop.