Tuesday, January 19, 2021

Genuary 2021 Day 19a : "Randomness Increases (Variant)"




I did the Day 19 prompt last night, letting it run overnight. I woke up around 2:30, and saw the first image coming out, and I was intrigued that the various stacks of cylinders looked like people to me. Some people were purple, and a lot were green. And everybody's feet were a lot bigger than their shoulders.

So, I decided to try a variant, going more deliberately in the direction of representing people. I grabbed some realistic skin tones. I adjusted dimensions to make people's bodies a little more recognizable.

And there were several things that made me uncomfortable and disappointed in the finished drawing. Above, there's a detail clipped from the original which avoids many of the issues.

  • There's a bug in my renderer (in the octree code?) that sometimes fails to render objects. When the renderer omits a body part, that's uncomfortable.
  • With stacks of shapes that read as humans, a viewer (at least this viewer) starts to attribute intentionality to things that humans would have control over. There's a clump of darker-skinned people on this side, and a different clump of lighter-skinned people on that side of the image. What does that mean? At one level, it means nothing. It's just what the dice picked this time around. But viewers will see human patterns, and if I present an image that evokes human patterns, I have to accept that those patterns come along with it, and that I'm the one presenting them.
  • The relative dimensions of "head" and "shirt" reminded me of a turtleneck sweater worn by a character in a cartoon I enjoyed as a kid, and again, that's a connection I'm not entirely comfortable with. I don't enjoy the cartoon now, and the connection to it makes me sad.
  • The rest of my generative artwork is abstract and meaningless. By making this a little less abstract, giving it a little bit of representation of people, it opens up a lot of issues. Kate Compton had some comments about Tracery grammars and Twitter bots touching on ethnic / social issues, and her message was that you, as a creator, need to be able to say to your audience that ALL of the outputs of your bot are valid and supported - it's not sufficient to shrug and say "well, sure, that's offensive, but that's just what the bot decided to generate".
  • I may be overreacting. I might be underreacting. I did a chunk of the work here on MLK Day, which might increase my sensitivity a little bit. Tomorrow is inauguration day.
  • How much of my reaction is the fact that a majority of the "people" that showed up in my work are brown-skinned? 
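As an aside on that renderer bug: I haven't tracked it down, but one classic way an octree can silently drop objects is inserting each object into the single child node that contains its *center*, and then pruning the query by child bounds - anything straddling a child boundary can be skipped entirely. The sketch below is purely hypothetical (it is not my renderer's code; the `Node`, `insert_by_center`, and `query` names are made up for illustration), but it shows how the symptom of "a body part just isn't there" can fall out of that mistake.

```python
# Hypothetical sketch of a classic octree pitfall, NOT my actual renderer:
# objects are filed under the one child containing their center, and the
# query prunes by child bounds, so boundary-straddling objects vanish.

class Node:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi   # axis-aligned bounds, 3-tuples
        self.objects = []           # (lo, hi) boxes stored at this node
        self.children = []

    def split(self):
        mid = tuple((l + h) / 2 for l, h in zip(self.lo, self.hi))
        for i in range(8):
            lo = tuple(self.lo[a] if not (i >> a) & 1 else mid[a] for a in range(3))
            hi = tuple(mid[a] if not (i >> a) & 1 else self.hi[a] for a in range(3))
            self.children.append(Node(lo, hi))

def overlaps(alo, ahi, blo, bhi):
    return all(alo[a] <= bhi[a] and blo[a] <= ahi[a] for a in range(3))

def insert_by_center(root, lo, hi):
    center = tuple((l + h) / 2 for l, h in zip(lo, hi))
    for child in root.children:
        if all(child.lo[a] <= center[a] < child.hi[a] for a in range(3)):
            child.objects.append((lo, hi))  # buggy: ignores the object's extent
            return

def query(root, qlo, qhi):
    hits = []
    for child in root.children:
        if overlaps(child.lo, child.hi, qlo, qhi):  # prune by child bounds
            hits += [o for o in child.objects if overlaps(o[0], o[1], qlo, qhi)]
    return hits

root = Node((0, 0, 0), (8, 8, 8))
root.split()
# An object straddling the x = 4 splitting plane; its center is at x = 3.9,
# so it lands in an x < 4 child.
insert_by_center(root, (3.4, 1.5, 1.5), (4.4, 2.5, 2.5))
# This query box overlaps the object, but only overlaps x >= 4 children,
# so the object is silently dropped:
print(query(root, (4.1, 1.8, 1.8), (4.3, 2.2, 2.2)))  # -> []
```

The fix is either to store the object in every child it overlaps, or to keep boundary-straddling objects at the parent node and test them there.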

1 comment:

  1. The relevant Compton reference: https://vimeo.com/225566776 around 24 minutes in.

    "Is it OK to put ethnic or cultural symbols in there? Are people going to assume that everybody in that description is white? Do I put signifiers of blackness or other cultures in there? Do I put explicit signifiers of gender in there?

    And the trouble with this [...] if you put people's signifiers in a blender, you are implicitly saying that any combination is valid. And we know that that's not true. Or that any combination is something that doesn't have to be defended vigorously."


