In June 2022, Constant stirred up the format of our typical worksessions and organised a series of four afternoons under the title Weak signaaal at the café La Vieille Chechette. We searched for weak signals as a practice that centers sensibilities towards what easily goes under the radar, is barely perceivable, or is straight up unclear. The workshop was a play on word2vec, a model commonly used to create ‘word embeddings’, a technique for preparing texts for machine learning. After splitting a text up into individual words, word2vec assigns a list of numbers to each word based on what other words it finds itself in the company of. Once trained, such a model can infer synonyms by comparing contexts, or suggest probable words to complete partial sentences. With word2complex, Cristina Cochior (Varia) proposed a thought experiment to resist the flattening of meaning that is inherent in such a method, trying to think about ways to keep complexity in machinic readings of situated text materials.
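The intuition behind word embeddings can be sketched in a few lines: each word gets a vector built from the words it keeps company with, and words that appear in similar company end up with similar vectors. The sketch below is a toy count-based version rather than word2vec's neural training, and the miniature corpus is invented purely for illustration.

```python
# Toy count-based word vectors: each word's vector is a tally of the
# words that appear near it. This illustrates the "company of words"
# idea behind embeddings; word2vec itself learns dense vectors with a
# small neural network instead of raw counts.
import math
from collections import defaultdict

corpus = [
    "the cat sat on the mat".split(),
    "the dog sat on the rug".split(),
    "a signal can be weak or strong".split(),
]

vocab = sorted({w for sent in corpus for w in sent})
index = {w: i for i, w in enumerate(vocab)}

# Count co-occurrences within a window of 2 words to either side.
vectors = defaultdict(lambda: [0.0] * len(vocab))
for sent in corpus:
    for i, w in enumerate(sent):
        for j in range(max(0, i - 2), min(len(sent), i + 3)):
            if j != i:
                vectors[w][index[sent[j]]] += 1.0

def cosine(a, b):
    # Similarity of two vectors by the angle between them.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# "cat" and "dog" share contexts ("the ... sat on"), so their vectors
# are closer to each other than either is to an unrelated word.
print(cosine(vectors["cat"], vectors["dog"]))
print(cosine(vectors["cat"], vectors["signal"]))
```

It is exactly this move, collapsing a word into one vector averaged over all its contexts, that word2complex pushes back against: a word that means different things in different situations still receives a single list of numbers.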













