I’ve been working on various projects over the last few months, some of which I will eventually talk about in my FlashForwardFriday previews of upcoming work. Many of them are still in the research and info-gathering stages, and I thought I would try some chat AI tools to see what they could give me. In the end, I used AI in four different ways.
Using AI as a brainstorming partner
The first was for a project I’ll talk about later this week, a review of the music of different years. I’ve written a previous take on 1943 as a year’s worth of music: what was going on, what music was everlasting, etc. But while I was planning to treat it as “A PolyWogg Guide to Music”, part of my ongoing series of serialized guides, I was having trouble branding it. So I booted up an AI chat and told it generally what I was trying to do … namely, look at the various lists of the “best songs of a year”, mostly from Billboard, and write my own review of a year’s music, comparing my list to the published one. With a touch of “Billboard got it wrong!”.
I had the chat give me a list of 10 titles to work from, and another list of 10 slogans to consider. I did a bit of iteration, using the AI to help me brainstorm, until I got to the end point: “The Unforgettable Sound of a Decade”. I have a few variations of that I can work with, but it was about 200x better than anything I came up with on my own.
Using AI for simple research
Some big radio stations started making year-end music lists in about 1943, Billboard shows up a few years later, and then Billboard REALLY gets going in about 1958 when it creates the Hot 100 list (by which time LOTS of groups are doing lists). But if I’m going to review 1940, 1941, and 1942 too, i.e., so I can review “the 1940s”, I need a good list for each of those years.
Now here’s where it got a bit interesting, and goes to the heart of AI’s fallibility. I told it to give me a list of the top 100 songs for 1940. It figured out that it should combine sales, other lists, other sources, and BAM, it gave me a list. As I went down it, though, I tripped over “Boogie Woogie Bugle Boy” by the Andrews Sisters. Which I had already reviewed, because … wait for it … it was released in 1943, not 1940. So I asked the chat, “Hey, why is that song on there when it wasn’t released until 1943?”. To which it basically said, oops, do you want me to revise the list and THIS time verify that each song is in fact listed as released and/or on the air in 1940? Sure.
Now, AI is only as good as its sources, and in this case, I suspect I know the problem. It found a list SOMEWHERE that had hits from the 1940s — with an S — and not just 1940. So there were probably OTHER songs on the list that were from other years too, I just happened to notice that one in particular.
Okay, so it generated me a new list, I started playing with it, and it seemed pretty legit. But then I noticed that a song by Bing Crosby actually showed up twice, both entries listed as being by Bing. Now, I could understand it if one entry said Glenn Miller and Bing Crosby and the other said just Bing Crosby; the AI isn’t smart enough to tell they’re the same (relatively speaking). But I asked it again, “Hey, how come (this song) is on there twice?” Another “oops”, and it removed the duplicate, taking the list down to 99 songs.
But then I noticed the problem that I expected: one entry credited to John Doe and another to John Doe with his band. It removed the second one, adjusted the list, and then asked me if I wanted to add more songs to compensate. So I had it add 25 more songs to the list, allowing for errors, repeats, etc., AND taking into account that I would want to find the songs, and that even using Apple, Spotify, Prime, and YouTube, not every song is going to be available for me to review.
Once I get past 1943, I have working lists, but I’ll still use the AI prompt to generate other lists to see what else maybe wasn’t big enough to make the Billboard lists but was still a big song in a given year. Just looking at the work it did, and taking into account that it took me 2 or 3 tries to iron out the validity bugs, the final result was still way better and faster than anything I could have compiled on my own.
Using AI as an image generator
I’ve already posted about some images I was generating with AI, and there are 3-4 in particular that have been helping me a lot. I was doing cover art for some guides I’ll be writing, and I wanted “logos” of a sort for the upper right corner, where I could have a red-eyed tree frog (RETF) doing something related to the theme. For music, it’s a RETF with some headphones on. For crafting, I have one where the RETF is cutting some construction paper.
I also had it generate some cover images, with some almost laughable results. One thing I wanted was a cover image for the guide to 1940s music. With a theme of going from “jazz to big band to swing”, I wanted something related to big band-ish performances. I’d love to use the Andrews Sisters, honestly, but all of their real images are for editorial use only. I tried finding some old-style orchestra/big band images, but again, either they weren’t quite right OR they didn’t have the right licenses available OR they were insanely expensive (think $500 for some images … great if you’re doing commercial work; mine are passion projects only). I finally had some AI sites generate some images, and I realized that I wasn’t very good at the prompts. So guess what? I asked the AI chat to give me a better description of the scene to feed to the image generator. My images improved dramatically. And I got one I absolutely loved (see below).

It’s a great image: I love the idea of lots of brass instruments, it’s done in B/W, there’s a woman singer/crooner there, it has a lot to recommend it. It’s not the Andrews Sisters, sure, but well, that’s not an option. I was ready to slap it on a cover, and in fact, I had actually put it into the template and was playing with it for some time to get it to look right. Except I hadn’t noticed something. The singer has three legs. I was thinking for a moment she was on a stool, but no, she’s an alien or something. I don’t even want to think of how you buy shoes for that scenario. 🙂 So AI generation in this scenario is not perfect. I didn’t think I had to tell the AI to get the right number of limbs for humans. But well, some AI generators are better than others, and I’m working with the lower end of the middle-quality group. Yet without paying hundreds of dollars, AI generation was about the ONLY option.
And then, for fun, I had it create an animated creature. I’ve recently started playing D&D after years of having very little interest. But then a friend offered to DM; he knew I had SOME interest in how it worked, and so we created a crew to give it a go. Everyone else is experienced; I’m the newb. I will never be fully immersed, that’s not my style, but I like the premise and am enjoying creating my character: a dragon-born bard, silver in colour. I don’t have ANYTHING to use as a figurine that looks like that description, but that’s okay, we have other stand-ins. But I’m having some trouble remembering what various spells do, and I don’t want to grab the handbook every time, so I created my OWN Bard Cards (mostly because everyone seemed to be sold out at the time; I just finished my set this week, and they’re back in stock, grrrrr). Although, to be fair, my cards are better. AND I used AI to give me images to go on each card, as well as on the backs in some places, customized for MY character. A little extra fun: some AI, some just straight image searching for personal use on my cards.
Using AI as an unpaid research assistant
I have two project examples where AI has done fantastic work for me. One is a variation of the classic complaint of professors, and the other is something much more mind-boggling.
So, one of my future projects is a review of the type of performance stories that public libraries tell. It’s a small itch to scratch, and I don’t have a lot of background info to even start the project. I have an idea, I know what I want to talk about, but I have bupkus for content. So when I was playing with some prompts a little while ago, I asked the AI to see if it could help me. I asked it to review reporting by public libraries and to identify the top 10 indicators they used to report on progress. I got a response, tweaked it; got a response, tweaked it; got another response, tweaked it. And when I was done, the AI result was a really good framework of various indicators, grouped and cross-listed against types of indicators and the functions that libraries perform. If I were sure it was complete and accurate, it would be a really good outline for an article or book. For me, it’s just a starting point, really, and I’ll have to vet a lot of the info, of course. For some students, this could be the outline for an essay on libraries.
I don’t think the AI writing is great, but as a starting point for research before going ahead and writing my own essay? Well, it would be pretty hard not to at least use it for brainstorming. I know how to use it “properly” and it isn’t about academic credit for me, so I’m pretty impressed with what it gave me for about 20 minutes of “work” on my part. By contrast, I was curious how fast it could tell me all the public libraries in Ontario, for example. After working on it for about 2 minutes, the AI threw up its hands: it couldn’t make me a list, but it DID give me the URLs for about 4 organizations in Ontario that had membership lists I could perhaps use to generate my own list with some elbow grease. Interestingly, it could handle the first task, which had a much bigger analytical component, but it had no chance at the simple “list”.
The next project is one that I’m not really ready to talk about yet. It’s a big fiction series that I have no idea if I’ll ever get around to writing, even when I am retired, since there are many non-fiction projects that I want to write first. But part of my hesitancy is that my idea involves something really complicated in terms of world-building, long histories, and about a dozen books. It’s not quite Wheel of Time levels of complexity, nor Lord of the Rings, but I do have some rules that govern the world. My problem, so to speak, has been that I can’t just write book 1, even though I have most of it mapped out in my head. I don’t know if some rule I put in book 1 is going to totally mess me up in book 5. And there’s a continuity issue that I’m aware of which will come to a head in book 12, but it also has to be apparent in the previous 11 books, without being TOO apparent.
ANYWAY, moving away from the context, I’ve been hesitant to do any of the work as I feel almost like I have to do the research for ALL TWELVE books before I even write book 1. Yet one afternoon, just playing with the chat function, I gave the AI a request to look at (blah blah blah) throughout history and how it developed in the UK in particular. Then I asked it to compare the UK history with French history, finding similarities. Then I added about 12 other cultures. Multiple dimensions. Multiple considerations. More formal in places, etc. And with each iteration, I took it further. After about 90 minutes of work to build a really good prompt, I had it generate about six pages of amazing information about the history.
Is it perfect? No, of course not, but it gave me all the basic backstory for the plotlines for book 11 in 90 minutes. I tweaked it, ran it again, and I had everything for book 8 in about 4 minutes. In short, not only can I have it summarize a ton of material that I was expecting to spend a year reading and researching, it actually did ANALYSIS of how the pieces fit together, I can replicate it across 12 books, and I have 18 months’ worth of research basically doable in an afternoon. I could start the book tomorrow if I wanted to. A book I didn’t even think I could DO anything with for another 4-5 years. After only an afternoon’s worth of “research” by my assistant.
Am I worried about using AI?
There are lots of people out there who are worried about the use of AI, worried that people will use it to churn out new books at a book a day and destroy publishing or sales. Meh. AI can write stuff, sure, and it’s better than the average high-schooler. But I think “proper” writers who use it will quickly figure out how to use it correctly and to what ends. Writing? No! Unpaid research assistant? Of course!
