So, about this AI thing:
Some of you may know I do some other journalism for money (look, it’s hard keeping my baby in real estate). Lately I find that when I use reporter services like Help a Reporter Out, more and more of the pitches I get sound like they were written by machines.
Let me explain. Here’s a query I put in HARO the other day:
In a recent provider call the Centers for Medicare and Medicaid Services (CMS) suggested doctors who perform Telehealth from their home offices will have to add their home address to their Medicare enrollment …
This raises a legal issue: Doctors often live in residential areas from which local zoning ordinances exclude businesses.
Freelancers often work from home offices and this is not an issue, but would registering one's home as a place of business with a federal agency create a zoning issue for these providers?
Now, I’ve hooked up with some great sources through these services over the years, but I’ve also gotten some extremely bogus ones. Some of them are clearly not from the field in question, or even acquainted with it, and to the extent that they bluff some expertise they mostly embarrass themselves.
Generally these bogus sources have one objective: To get themselves (or whoever hired them) into the press.
It’s not so long a shot as it sounds. For one thing, reporters — who, you should be aware, are mostly not celebrity feature writers scarfing canapés at the publisher’s mansion in between long lead times, but poor overworked bastards filing to multiple outlets for nickels and dimes — are happy for any content they can get. So if someone will speak to them on the record who at least sounds like they know what they’re talking about, if they’re not obviously wrong (like “gorillas subsist on Zwieback toast” wrong), they’re a blessing. If the fifty to a hundred words they give you aren’t really adding anything to the story, they’re at least fifty to a hundred words, and a named source.
Cynical as I am, I don’t go for that kind of thing. If the respondent seems suspicious, I try to draw them out with more specific questions; 98% of the time they just don’t respond. Many don’t even require that level of scrutiny.
One particular response to my zoning query was an obvious loser, but in a way that reminded me of several I’ve received over recent months. Have a look:
When doctors adopt telehealth practices and provide medical services from their homes, zoning regulations can become a relevant consideration. Zoning refers to the designated land use regulations established by local authorities to maintain order and ensure the compatibility of different activities within specific areas. In the context of telehealth, the primary concern is whether home-based medical consultations violate zoning restrictions that typically categorize residential areas as separate from commercial or professional spaces. While zoning regulations may vary, many jurisdictions have recognized the need for flexibility and have adjusted their rules to accommodate telehealth services. However, it remains essential for doctors engaging in telehealth to familiarize themselves with local zoning regulations and seek necessary permissions or exemptions to operate within the legal framework. By adhering to zoning requirements, doctors can uphold professionalism, legal compliance, and maintain the integrity of their telehealth practice.
This doesn’t sound like the usual fake, who generally has the manner of a bore at a party vamping to pretend he knows what the conversation is about. This one follows the logic, at least, of a bad term paper, but while bad term papers have at least some odor of flop sweat, this one is so slickly constructed that if one were only assessing confidence and not content, one might get the impression the guy knows his onions.
The creator seems to have very methodically extracted key concepts from my query — zoning laws, telehealth, home-based labor — and found some information related to those concepts — flexibility, compatibility, professionalism, exemptions, legal framework — to dress the branches with leaves, as it were.
In other words, like the earlier frauds, this one says absolutely nothing — but with utter sangfroid and almost preternatural attentiveness to rules of grammar and old-school essay format.
To my mind there are two possible explanations. Either someone spent a lot of time creating this shiny nullity — more time, certainly, than it could be worth to any press-hustling client, which makes it the more far-fetched — or it was written by a machine.
I predict that, as the AI algorithms improve and better simulate their human models, they’ll drop some slight imperfections like dangling modifiers into the mix to throw off the scent those few editors who even give a shit whether it was written by a person or a machine.
I won’t say much more than that about the subject (at the moment — you can bet I’ll follow up later) except this:
Everyone in what we normally call communications, whether it’s corp comm or journalism or screenwriting or whatever, is trying to put something over. Sometimes it’s the truth, sometimes it’s a product. I know there are some geniuses whose work flows fully formed from their brows like Athena from the head of Zeus, but most of it is done by a process known as craft. That process is frictive and tedious, and sometimes the creator (not me, gentle reader!) weakens and falters enough to either tell the truth if they’re trying to lie, or to lie if they’re trying to tell the truth. And you can hear it. As humans, all of us in the communications game, we have developed instincts to hear both the truly false and the falsely true note.
But the machine is beyond all that. It is in the business of perfecting its craft — that is to say, of perfecting itself. And it is not doing this to perfect the transmission of the truth. It is working — ceaselessly, tirelessly, thoughtlessly — on the perfection of the lie.
That AI example just keeps spitting back the issues, concerns, needs, etc., like a student spinning wheels on a Sociology exam. Actual journalism might have "While zoning regulations may vary, many jurisdictions..." followed by actual reporting: "In San Jose, CA, for example, regulations..." Or "it remains essential for doctors engaging in telehealth..." with "Bob Smith, a proctologist in Phoenix, was sued by the city government..."
What struck me most is that it sounded exactly like the endless Megan McArdle boilerplate that enfolds her libertarian opinions. Could she have been an artificial intelligence all along? Or does the word "intelligence" nullify that possibility?
I'm just pissed because I paid good money for that Zwieback toast/gorilla info.
Pat Robertson is dead.
When my mother-in-law passed away, we discovered she was making an automatic $600-a-month donation to Pat.
This was while we were helping her out with groceries and utility bills.
I'm not a fan of Pat's.