
PRODUCTHEAD: This article was NOT written by AI


PRODUCTHEAD is a regular newsletter of product management goodness,
curated by Jock Busuttil.

product on the ladder


tl;dr

When there is a financial incentive, people will industrialise technology to automate money-making activities

Authoritative-sounding yet factually inaccurate content generated by AIs is harmful

Product managers should be primarily concerned with what is best for users from an ethical point of view


a favour: please share this with other product people

every PRODUCTHEAD edition is online for you to refer back to

hello

Dystopia warning

If content is king, what are we to do when content generation occurs solely to satisfy the whim of online advertising algorithms?

Have technology, will get rich quick

Whenever some mechanism exists to allow people to make money (particularly if automated), people will figure out ways to game the system in order to get rich quick with minimal effort. We see examples of this all over the place.

On a relatively small scale, salespeople tend to take the path of least resistance to their commission.

When cryptocurrency mining was still a thing, people figured out that they could make significant profits with essentially no ongoing effort, despite the massive capital outlay on data centre-sized mining rigs and the other associated costs (power, cooling and so on).

Telephone scammers figured out that they could industrialise their operation by automating their outbound calls (“robocalls”). These systems would make millions of outbound calls per hour. With a response rate of 3 to 5 percent, the economies of scale made this type of fraud profitable.
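To make that scale concrete, here is a rough back-of-the-envelope sketch in Python. Every figure is an illustrative assumption chosen for the example, not data from any real operation.

```python
# Illustrative robocall economics. Every figure below is an assumption
# chosen for the sake of the example, not data from a real operation.

calls_per_hour = 1_000_000   # assumed automated dialler throughput
hours_per_day = 10           # assumed operating window
response_rate = 0.03         # 3% of calls get a live response (low end of 3 to 5%)
conversion_rate = 0.01       # assumed 1% of responders are actually defrauded
average_loss = 200.0         # assumed average amount taken per victim (USD)
cost_per_call = 0.001        # assumed cost of placing one automated call (USD)

calls = calls_per_hour * hours_per_day
victims = calls * response_rate * conversion_rate
revenue = victims * average_loss
cost = calls * cost_per_call

print(f"Calls placed per day: {calls:,.0f}")
print(f"Victims per day:      {victims:,.0f}")
print(f"Revenue per day:      ${revenue:,.0f}")
print(f"Cost per day:         ${cost:,.0f}")
print(f"Profit per day:       ${revenue - cost:,.0f}")
```

Even with deliberately conservative guesses, the profit dwarfs the cost of making the calls, which is why automating them is so attractive.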

And as soon as social media networks became less about people interacting with each other, and more about content creators, brands and influencers seeking engagement to monetise their audiences, those who figured out how to “win” the algorithm saw a corresponding uptick in their revenues.

This too has become industrialised, and unsurprisingly this isn’t a good thing either. James Bridle’s bleak essay about the not-so-hidden horrors of YouTube Kids is a particularly disturbing case in point.

Hey Google

Google’s current search algorithm tends to prioritise original, longer-form content on a particular topic, written by people, for people.

“While E-E-A-T [Experience, Expertise, Authoritativeness, and Trustworthiness] itself isn’t a specific ranking factor, using a mix of factors that can identify content with good E-E-A-T is useful. For example, our systems give even more weight to content that aligns with strong E-E-A-T for topics that could significantly impact the health, financial stability, or safety of people, or the welfare or well-being of society. We call these “Your Money or Your Life” topics, or YMYL for short.”

“Creating helpful, reliable, people-first content”, Google Search Central (retrieved 22 January 2023)

Soon after, it felt like every business had swapped strategies. Instead of publishing keyword-stuffed infodumps, they’d hired swathes of copywriters to churn out plausible, if superficial, content on topics they believed to be relevant to their target customers. (I was approached by a few, and declined.) If you search for “what is a product roadmap” on Google, most of the non-advert hits are from companies selling — you’ve guessed it — product roadmap software.

Now, lest I start sounding like a massive hypocrite, given that I write about product management and run a product management company: if the intent of the content is simply for experts to share their knowledge and experience with peers, then that’s absolutely fine. More wind to their sails.

If, however, the content is being written to spec by people with no background in that particular area of expertise, mainly by paraphrasing other people’s original work, solely to drive more traffic to the company’s website, then it’s just unnecessary extra noise. It adds nothing new to the conversation. It does not progress the state of the art.

CNET has been quietly using AI to write consumer advice

A particularly egregious example of this has been unfolding over the last few days; you may have seen the story in the tech news. As reported by Frank Landymore in Futurism, in November 2022 CNET Money started publishing articles written solely by “automation technology”. These included topics such as “What is Zelle and How Does It Work?”. In other words, convincingly written articles on a specific topic, generated by a proprietary algorithm trained on data sets of other people’s writing. Given Google’s own guidelines, it’s probably no surprise that these articles appeared first in the area of consumer money advice — “Your Money or Your Life” topics.

Mia Sato and James Vincent delved deeper on The Verge:

“Various affiliate industry sites estimate the bounty for a credit card signup to be around $250 each. A 2021 New York Times story on Red Ventures pegged it even higher, at up to $900 per card.

Viewed cynically, it makes perfect sense for Red Ventures [current owner of CNET] to deploy AI: it is flooding the Google search algorithm with content, attempting to rank highly for various valuable searches, and then collecting fees when visitors click through to a credit card or mortgage application. AI lowers the cost of content creation, increasing the profit for each click. There is not a private equity company in the world that can resist this temptation.”

“Inside CNET’s AI-powered SEO money machine”, Mia Sato and James Vincent, The Verge (19 January 2023, retrieved 22 January 2023).
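To see why the temptation is so hard to resist, here is a similar rough sketch of the affiliate economics. All numbers are assumptions for illustration only, not Red Ventures’ or CNET’s actual figures.

```python
# Illustrative affiliate-content economics. Every figure is an assumption.

bounty_per_signup = 250.0      # assumed affiliate fee per credit card signup (USD)
visitors_per_article = 10_000  # assumed monthly search visitors to one article
signup_rate = 0.005            # assumed 0.5% of visitors complete a signup

human_cost_per_article = 500.0  # assumed fee for a freelance copywriter
ai_cost_per_article = 10.0      # assumed marginal cost of an AI-generated draft

revenue = visitors_per_article * signup_rate * bounty_per_signup

print(f"Monthly revenue per article: ${revenue:,.0f}")
print(f"Profit with a human writer:  ${revenue - human_cost_per_article:,.0f}")
print(f"Profit with an AI draft:     ${revenue - ai_cost_per_article:,.0f}")
```

The revenue side stays the same either way; lowering the cost of producing each article is pure extra margin, multiplied across every article the machine can churn out.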

Aside from drowning out articles written by actual humans, this AI-generated content suffers from the same problem as the popular ChatGPT: it sounds plausibly authoritative, yet is often factually incorrect. Sure, human authors make factual errors too, but they are not automatons that can relentlessly flood the internet with misleading information.

As of Friday 20 January 2023, CNET has paused all AI-generated content “for now”.

The product management angle

Not too long ago, I wrote about what I believe to be the underlying principles of product management:

  • User needs first, business needs second
  • Understand the “why” and the context
  • Fight uncertainty with frequent research, experimentation and data
  • Be human
  • Do no harm
“The unifying principles of product management”, Jock Busuttil, I Manage Products, 25 August 2021 (retrieved 22 January 2023)

Most product managers work in for-profit companies. The lure of making money from products, and the corporate pressure to do so, can easily undercut the first of my principles. Just because you can do something with technology doesn’t mean you should.

Creating valueless, synthetic content intended primarily to increase search engine rankings, and in turn to make more money, puts the needs of users a distant second. And that’s regardless of whether the content is generated by underpaid copywriters or a fully automated AI.

Likewise, it is harmful to generate misleading or factually incorrect content at any scale and offer it to people under the guise of authority.

Final thoughts

This week I’ve also included an article by Cory Doctorow in which he rails against product managers for being complicit in the “ensh*ttification” of platforms such as TikTok and Amazon, where the compulsion to milk all sides of their captive markets of suppliers and consumers for all they’re worth ultimately kills the platform.

Product management is more about the “why” than the “how”. Asking how to grow a product’s user base or its financial success is a less meaningful question than asking why we want to do these things. If the answer to “why” is “to serve the needs of the company, investors” (and so on) rather than “to serve the needs of our users”, then we have the wrong intent.

We often talk about differentiating between what users want and what they need. This isn’t just semantics; the difference can have profound ethical implications. Giving or selling people what they want may be the fast route to corporate profits, but what people want isn’t always what’s best for them. It would be deeply unethical to have a cake stand next to a weight-loss clinic, although it would probably do a roaring trade.

This is why we do our best to put aside our personal opinions, experience and bias when examining the people whose problem we’re trying to solve, and their context. We’re trying to get past our users’ wants and desires (and our biased perception of them) to the factual evidence of their underlying needs. What is best for them is what they fundamentally need, and the test of what they need is what’s best for them from an ethical point of view.

Speak to you soon,

Jock

As acerbic UK political commentator Tan Smith would say.



what to think about this week

Inside CNET’s AI-powered SEO money machine

Fake bylines. Content farming. Affiliate fees. What happens when private equity takes over a storied news site and milks it for clicks?

When bots start publishing content

[Mia Sato and James Vincent / The Verge]

SEO spammers are absolutely thrilled Google isn’t cracking down on CNET’s AI-generated articles

In the wake of our reporting that CNET had been quietly publishing dozens of AI-generated articles, many expressed dismay at what seemed like an underhanded attempt to eliminate the jobs of entry-level human writers while downplaying the shoddy content to the site’s readers.

One group was absolutely thrilled, however: spammers, who could scarcely contain their glee that a mainstream publisher was getting away with churning out bot-written content — and immediately expressed plans to do the same.

“Time to pump out content at ultra-high speed.”

[Frank Landymore / Futurism]



CNET pauses publishing AI-written stories after disclosure controversy

In a staff call on Friday, CNET leadership told staff it was pausing all AI-generated content for now. Top executives at Red Ventures, the firm that owns CNET and other websites, also offered more details on the company’s AI tool.

Going dark until the heat dies down

[Mia Sato / The Verge]

Tiktok’s ensh*ttification

Here is how platforms die: first, they are good to their users; then they abuse their users to make things better for their business customers; finally, they abuse those business customers to claw back all the value for themselves. Then, they die.

It’s okay. We don’t need eternal rulers of the internet

[Cory Doctorow / Pluralistic]

recent posts

The unifying principles of product management

A recent tweet by John Cutler provoked some interesting reactions. It got me thinking about whether there are unifying principles of product management that apply in all contexts.

Constant underlying principles, tailored approach

[I Manage Products]

Can I pick your brain about product roadmaps?

Hey Jock,

I would love to pick your brain about product roadmaps.

To give more clarity to my team (which is working on two mandates/sub-projects), I would like to empower them with a roadmap/timeline of the project. We already have one on Confluence, which I’ve printed and put next to our whiteboard. But it’s not “ours”.

Any recommendation in terms of shape/content?

Show everything your team is working on

[I Manage Products]

Will platforms conquer the world?

Product managers of software and hardware platforms face unique challenges that PMs of ‘regular’ products do not.

In this panel discussion, Hans-Bernd Kittlaus discusses platform product management with Samira Negm, Peter Stadlinger and Jock Busuttil.

Or have they already done so?

[I Manage Products]

upcoming talks and events

We have a couple of dates coming up for our Advanced User Research masterclass.

Friday 10th February 2023 · 10:00 – 12:15 (UK time)

Friday 24th February 2023 · 10:00 – 12:15 (UK time)

This live, online session is an introduction for product managers and their delivery team to advanced user research concepts and techniques.

If you’ve already had some experience of discovery research and user testing but want to go deeper, this masterclass will provide you and your team members with an advanced user research toolkit. You’ll grow your interpersonal skills, which will be valuable in other areas of your work too.

We offer subsidised pricing for individuals paying their own way.

can we help you?

Product People is a product management services company. We can help you through consultancy, training and coaching. Just contact us if you need our help!

Helping people build better products, more successfully, since 2012.

PRODUCTHEAD is a newsletter for product people of all varieties, and is lovingly crafted from {ERR – text placeholder not found}. Just joking.
