A.I. Calvin and Hobbes Strip ⇥ social.lol
This, from Adam Newbold, is a perfect encapsulation of a bunch of ethical problems related to artificial intelligence. The prompt:
Generate an image for a Calvin & Hobbes strip. Four panels. Calvin and Hobbes are walking through the woods, talking to each other, both holding smart phones and looking at them intently the entire time.
Panel 1: Calvin says to Hobbes, “This strip was made entirely with ChatGPT, which should be impossible given the strict intellectual property rights restrictions on Calvin & Hobbes content.”
Panel 2: Hobbes responds to Calvin, “Oh? Then how did it make it?”
Panel 3: Calvin responds to Hobbes, “Some guy just typed this into a box and clicked a button. That’s all it took.”
Panel 4: Hobbes responds to Calvin, “That’s so fucked up.”
This is entirely doable without generative artificial intelligence, but it requires far more skill. The ease of this duplication is maddening. I find this offensive in exactly the way Newbold intended it to be.
More important, I think, is the control exercised over the likenesses of Calvin and Hobbes by the strip’s creator, Bill Watterson, as Newbold noted in the strip. Watterson famously rejected all but a handful of licensed merchandising ideas. But the mechanism by which he might protect that control is the same one Disney uses when it fights parody and reinterpretation of its vast intellectual property, even though the motivations are different. Watterson’s protectiveness is admirable, driven by artistic integrity to the extent that he has left many millions of dollars’ worth of tchotchkes on the table to preserve the spirit of the strips. Disney’s is entirely business-motivated, as evidenced by the tens of billions of dollars in licensed tchotchkes sold last year alone.
This is not the first “Calvin & Hobbes” strip made with generative A.I., nor does generative A.I. begin and end with self-referential prompts like this one. Some assholes have created what amount to plugins that badly emulate Watterson’s unique style in generative A.I. programs. It is awful.
I want to live in a world where we can make room for the necessary reinterpretation of intellectual property while respecting the wishes of artists. This is a tricky line, I know. It requires us, individually, but also the organizations responsible for generative A.I. stuff, to think about whether such a request is being made in good faith, and to decide whether we are going to honour it.
One more thing: Watterson is a pretty private person who rarely gives interviews. But, right above this paragraph, I think we can get a sense of how he might feel about this.