Pierre - Shaping vectors


A vector is a mathematical entity consisting of a series of numbers grouped together to represent another entity. Vectors are often associated with spatial operations: the entities they represent can be either a point or a direction. In computer science, vectors are used to represent entities known as features, measurable properties of an object (for instance, a human can be said to have features such as age, height, skin pigmentation, credit score and political leaning). Today, such representations are at the core of contemporary machine learning models, allowing a new kind of translation between the world and the computer [ [ I like the question „How to represent the fluidity of human knowledge creation with computation?“ The question is how to do this using math? How to create ruptures? and how to expand western thought? ] You seem to miss a lower-level negotiation of how these algorithms create a fluid set of knowledges; maybe it is real to have gaps, boundaries and places where we can't match up? especially when we expand the critique beyond western thought, how do these spaces disable these voices and knowledges in a homogenisation. ].
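As a minimal sketch of what such a feature representation can look like (the names and values below are invented for the illustration, not drawn from any actual dataset or model):

```
# A hypothetical person reduced to a feature vector.
# Feature names and values are invented for this illustration.
features = ["age", "height_cm", "credit_score", "political_leaning"]

person = [34.0, 172.0, 710.0, -0.3]

# The vector only means something relative to the agreed ordering of features:
print(dict(zip(features, person)))
# {'age': 34.0, 'height_cm': 172.0, 'credit_score': 710.0, 'political_leaning': -0.3}
```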

This essay sketches out some of the implications of using vectors as a way to represent non-computational entities in computational terms, as other visual mnemotechnics have done in the past (Jack Goody on the list, Bruno Latour on the perspective drawing), suggesting that there are epistemological consequences to choosing one syntactic system over another. While binary encoding allows a translation between physical phenomena and concepts, between electricity and numbers, and while Boolean logic facilitates the implementation of symbolic logic in a formal and mechanical way [

```

# Boolean logic made mechanical: a single condition decides, exactly, what happens.
if user.citizenship not in allowed_citizenries:
    raise Exception("citizenship not allowed")
else:
    pass

```

], vectors open up a new perspective on at least two levels: their relativity in storing (encoding) content and their locality in retrieving (decoding) content.

In machine learning, a vector represents the current values of the properties of a given object: a human would have a value of 0 for the property "melting point", while water would have a non-zero value for that property. Conversely, water would have a value of 0 for the property "gender", while a human would have a non-zero value for that same property. This implies that each feature in this space is aware of all the other dimensions of the space. Vectors thus always contain the potential features of the whole space in which they exist, and are defined more or less tightly in terms of each other (as opposed to, say, alphabetical or cardinal ordering). The proximity, or distance, of vectors [ I wonder how this shifts our understanding of perception? or why this way of seeing matters? ] to each other is therefore essential to how we can use them to make sense [ How do you see the link to Bayesian probability (not that I know much about this myself) that seems to make sense (probability) by measuring distances in variables? Or ideas of diffusion and latency. It is really valuable to (as you do) contemplate how statistical reasoning relates to a semantic space. ]. Meaning is therefore no longer created through logical combinations, but by spatial proximity in a specific semantic space. Truth moves from (binary) exactitude to (vector) approximation [ do these two modes of truth coexist? [ approximations remove ambiguities ] [ is there the possibility of a lo-fi or impoverished truth? ] ].
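A hedged sketch of how such proximity can be measured, here with cosine similarity over invented feature values (the numbers carry no empirical weight):

```
import math

# Invented vectors in a shared feature space; the features could be read as
# ("gender", "melting point", "credit score"), as in the text above.
water = [0.0, 0.9, 0.0]
ice   = [0.0, 0.8, 0.0]
human = [0.7, 0.0, 0.6]

def cosine(a, b):
    # Proximity as meaning: 1.0 = pointing the same way, 0.0 = nothing in common.
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

print(cosine(water, ice))    # 1.0: water and ice sit close together in this space
print(cosine(water, human))  # 0.0: an approximate, spatial kind of truth, not a deduction
```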

As we retrieve information stored in vectors, we therefore navigate semantic spaces. Such retrieval is only useful if it is meaningful to us; and in order to be meaningful, it navigates across vectors that are in close proximity to each other, focusing on a re-configurable [ would be interesting to understand the historical role of cybernetics and the idea of a system, and later the metaphor of the network, in the role that vectors have in modern computing [ “Simulation attempts to resemble the real, to ‘realize’ it, to bring out what is only implicit in it and make it explicit. But at a certain point in its progress it draws too close to the original, and further increases in perfection, instead of bringing the system closer to this original, only drive it further away. The system begins to reverse upon itself, gives rise to the opposite effects from those intended.” (Butler 1999: 25) ] ], (hyper-)local coherence to suggest a meaningful structuring of content.
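A minimal sketch of retrieval as navigation, assuming an invented store of items and vectors and using cosine similarity as the measure of proximity:

```
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

# A tiny, invented store of content items and their vectors.
store = {
    "recipe for bread":  [0.9, 0.1, 0.0],
    "history of wheat":  [0.7, 0.3, 0.1],
    "smartphone review": [0.0, 0.1, 0.9],
}

def retrieve(query, k=2):
    # Retrieval as navigation: keep only the k items closest to the query,
    # i.e. the (hyper-)local neighbourhood of the semantic space.
    return sorted(store, key=lambda item: cosine(store[item], query), reverse=True)[:k]

print(retrieve([0.8, 0.2, 0.0]))  # ['recipe for bread', 'history of wheat']
```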

Given the existence of features in relation to one another, and the construction of meaning through the proximity of vectors, we can see how semantic space is both malleable in storing meaning and structural in retrieving meaning, prompting questions of literacy. A further line of inquiry is to think about the process through which this semantic space is being shaped [ Curious how you would differentiate the semantic space from the latent space? I think most artists working with the vectorial conception of machine learning would rather view the latent space in spatial terms as a form of sculpting - e.g. https://a-desk.org/en/tema-del-mes/latent-space-ai-art/ ], through the process of corporate training [ the quality of a large language model is based on the size of the company training the model [ The way those models are "disciplined" through tests, benchmarks and exams is reminiscent of Michel Foucault's description of the modern school in disciplinary societies ] ], and how it, in turn, shapes us into specific communities of perceiving. For instance, in which space is _Europe_ shaped into proximity with _Tourism_ rather than with _Colonialism_? In which space is _Technology_ surrounded with positive adjectives rather than with negative connotations?
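As a toy illustration of that question, two invented spaces can place the same word next to different neighbours; none of the vectors below come from a real model:

```
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

def nearest(space, word):
    # The closest other word to `word` inside a given space.
    query = space[word]
    others = [w for w in space if w != word]
    return max(others, key=lambda w: cosine(space[w], query))

# Two hypothetical spaces, standing in for models trained on different corpora.
# Every number below is invented for the sake of the illustration.
travel_space  = {"europe": [0.8, 0.2], "tourism": [0.9, 0.1], "colonialism": [0.1, 0.9]}
history_space = {"europe": [0.2, 0.9], "tourism": [0.9, 0.1], "colonialism": [0.1, 0.9]}

print(nearest(travel_space, "europe"))   # tourism
print(nearest(history_space, "europe"))  # colonialism
```

Which neighbour comes back depends entirely on the space the model has been trained into, not on any property of the word itself.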


CONCLUDE WITH AN EXAMPLE?


Bibliography:

- Latour, Bruno, "Les vues de l'esprit" (Visualization and Cognition)

- Goody, Jack, La Raison graphique (The Domestication of the Savage Mind)

- Foucault, Michel, Discipline and Punish

- Deleuze, Gilles, "Postscript on the Societies of Control"

- Butler, Judith, Gender Trouble


images?

[ questions of literacy, of kinds of readings, legibility ]

[ communities of perceiving ]