james girouard

Ravelling to an Algorithm

Updated: Aug 19

To continue my own learning about artificial intelligence, I tried to get down to the basic idea of what this 'thing' actually is. I am curious about how a machine can make informed choices. What is it really doing 'under the hood'?



Fuzzy Logic is a rug that I hooked to try to wrap my head around this question. It depicts a sequence of dots in rows of color, from five dark blue dots on one end, transitioning through lighter blues, greens, yellows, and pinks, and ending in two red dots on the other end. Each column or row of dots is echoed along the border in bands of corresponding hues. Between and among the rows of cookie-sized dots are black lines that connect each row of dots to the next, all on a warm white striated background.

Each color on the rug has a different feel and represents an evolution of an idea. Each builds on the one before, informed by the previous; each dot influences the next. As the idea, or color, transitions, it carries with it the weight of the previous color, or idea, suggesting a hue, or a meaning, for the subsequent dot, or nugget of information. This transition is essentially what is happening inside a generative ai large language model.

As an ai crunches away at the inputs you feed it, it considers the words you give it, looks at likely combinations of those words and all the other words it has access to, and pieces together something that funnels fuzzy ideas toward the words it most commonly sees around the original inputs. Said another way, it takes a piece of language, 'considers' the likelihood of the next word or phrase, and spits out the most probable combination. Most of the time this is a pretty good way to go about things, but there can be problems.
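For anyone curious about the mechanics, here is a minimal sketch in Python of that 'most likely next word' idea. The tiny corpus and the simple bigram counting are my own invention for illustration; a real large language model works at a vastly larger scale with far more sophisticated statistics, but the flavor is similar.

```python
from collections import Counter, defaultdict

# A tiny, made-up corpus standing in for the vast text a real model learns from.
corpus = ("the dark blue dot leads to a lighter blue dot "
          "and the lighter blue dot leads to a green dot").split()

# Count how often each word follows each other word (a bigram model --
# a drastically simplified stand-in for what a large language model learns).
following = defaultdict(Counter)
for current_word, next_word in zip(corpus, corpus[1:]):
    following[current_word][next_word] += 1

def most_likely_next(word):
    """Return the word most often seen after `word` in the toy corpus."""
    candidates = following.get(word)
    return candidates.most_common(1)[0][0] if candidates else None

print(most_likely_next("blue"))     # 'dot'  -- the most common follower
print(most_likely_next("lighter"))  # 'blue'
```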


It can be really troublesome when the output of ai is wrong. If these combinations of words should go together, how might the machine make mistakes? One of the main reasons is bias. A bias in an algorithm is unwieldy; it veers away from the straight lines of data that we would hope a computer would follow. Algorithmic bias comes from skewed data or a poorly functioning system. Bias in ai can have disastrous results, so much so that problems from ai bias are being addressed as challenges to fundamental human rights. Personally, I think it's pretty cool that this term is also used in textiles to describe a diagonal to the grain of fabric, which is stretchier and more unwieldy than the lengthwise and crosswise grain.
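To see how skewed data turns into skewed output, here is a toy illustration with entirely made-up numbers (nothing measured from any real system): if the text a model learns from pairs a word with one pronoun far more often than another, the 'pick the most likely continuation' rule doesn't just repeat that imbalance, it amplifies it.

```python
from collections import Counter

# Hypothetical counts of pronouns seen after the word "nurse" in some
# imagined training text -- the 90/10 skew is invented for illustration.
seen_after_nurse = Counter({"she": 90, "he": 10})

total = sum(seen_after_nurse.values())
for pronoun, count in seen_after_nurse.most_common():
    print(f"nurse -> {pronoun}: {count / total:.0%} of the time")

# A system that always picks the most frequent continuation will say "she"
# every single time, turning a 90/10 skew in the data into a 100/0 bias
# in its output.
print("chosen continuation:", seen_after_nurse.most_common(1)[0][0])
```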

Fuzzy Logic has an algorithmic (and a sewing) bias that I noticed about halfway through building it. Have a close look to see if you can find it. When I noticed it, my first thought was to tear it out and make the rug perfect. But as I considered my purpose in building this rug, I made the conscious decision to leave this unintentional bias in it. As the whole piece came together, ideas about the bias kept running through my head, and the meaning of the rug seemed determined to emerge on its own. This rug, in its rainbow of colors, is a commentary on gender and gender bias in ai.


Simplified into a spectrum of colors, the rug can be seen to reflect Pride. Considered more deeply, the traditional pink and blue gender stereotypes sit almost at opposite ends, on a spectrum. The bias, in case you haven't spotted it, is that the pinks sit a little closer to the border on one side. Maybe the rug is skewed toward the feminine, though in our regrettably patriarchal society it is more likely the opposite. It's hard to unsee it now, but I'm ok with that. In fact, I like it. This is one of those times when an accident turns into a feature. I wonder what the algorithm will say about that?


