On the Approach to Physics, DeLanda

Jude H
8 min read · Jan 12, 2022


Contradiction and disunity are naturally present in every field of study, including “science”, yet science has been developed throughout history as an individual field, as a totality defined by an essence. This makes it incompatible with a flat and experimental ontology: the essentialism and typology that characterize science prevent an accurate assessment of scientific development as experimental, disunified, and heterogeneous.

A good place to begin is with the notion of fundamental laws. In physics, and specifically in classical mechanics, Newton’s laws are treated as truths or presuppositions from which everything else follows mechanically by logical deduction. This ties back into essentialism: singularities are viewed as instances of categories, while the productive processes and causal connections from which these singularities arise are ignored. Causal connections subvert typical understandings of deduction, since they signify that causes produce their effects. The common philosophy of science has instead been ruled by laws of constant regularity, understanding causality as linear: effects follow mechanically from their causes, governed by exceptionless, general or fundamental laws under which a given cause is bound to produce the same effect. These laws are also typically represented linguistically and subordinated to that representation, only furthering the essentialist approach to classical physics. Linear causality renders the response of a system unproblematic, whereas models of non-linear and statistical causality consistently leave something unexplained while showing the autonomy of self-organization.

Fixed, homogeneous, and inert representations of matter are exemplary of linear causality, adopted from Aristotle. Aristotle, after all, is the philosopher of essence, and the submission of matter to form is coherent with his philosophy. Starting from the form, Deleuze, in Difference and Repetition and A Thousand Plateaus, advocates a materialist energeticism of matter in movement, adding “variable intensive affects” (ATP 408) to the formal essence. This makes it a perversion of essentialism rather than a simple opposition to it. DeLanda concludes:

“Additivity and externality presuppose, as I said, a matter obedient to laws and constituting an inert receptacle for forms imposed from the outside. Matter under non-linear and non-equilibrium conditions is, on the other hand, intensive and problematic, capable of spontaneously giving rise to form drawing on its inherent tendencies (defined by singularities) as well as its complex capacities to affect and be affected” (ISVP 170)

The common approach dismisses productive and dynamic causes and processes in favor of a turn toward constant regularities. The constant-conjunction account of causation, on which to say that A caused B is not to say that A produced B but only that A is regularly followed by B, was first postulated by David Hume and is mirrored in the post-Newtonian attitude in physics. Scientists began to seek only regularities in nature, rather than seeking out causes and posing problems (experimental exploration). Essentially, these scientists attempt to deduce explanations from general regularities, with the goal of attaining universal truths and deducing as many phenomena as possible from fundamental principles [1]. Ronald N. Giere, author of Explaining Science: A Cognitive Approach, gives the example of the pendulum: the reduction of the two-dimensional pendulum to a one-dimensional model through judicious deduction from Newton’s laws is precisely this case.
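To make the pendulum example concrete, here is a minimal sketch of the kind of deduction Giere has in mind, under the usual textbook idealizations (a rigid, massless rod, a point bob, no friction); the details are a standard reconstruction, not Giere’s own wording:

```latex
% Newton's second law for a bob of mass m on a rigid rod of length L,
% swinging in a vertical plane. The rod constrains the bob to a circular
% arc, so the two planar coordinates reduce to a single angle \theta.
% Projecting the law onto the tangential direction eliminates the
% unknown rod tension:
\[
  m L\, \ddot{\theta} = -\, m g \sin\theta
  \quad\Longrightarrow\quad
  \ddot{\theta} + \frac{g}{L}\,\sin\theta = 0 .
\]
% A further idealization (small oscillations, \sin\theta \approx \theta)
% gives the linear, exactly solvable model with period
\[
  T = 2\pi \sqrt{\frac{L}{g}} .
\]
```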

The truth is that the practice of physics is far too intricate and convoluted to be captured by a deductive-nomological approach, reduced to logical relations and deductions. In the deductive-nomological approach, explanation is treated as a deduction from a set of propositions: we begin with a linguistically expressed law and a set of propositions, later to be tested for truth or falsity in the laboratory, and finish by subsuming phenomena under a general category. The axiomatic that structures the scientific field in this fashion treats deduction as the mere transmission of truth or falsity, and in this sense axioms are essences, since whatever truth is deduced must already have been contained in the axioms. [2]

In Difference and Repetition, Gilles Deleuze presents the Kantian concept of the ‘shortest distance’ as a schema that determines space in accordance with the concept. This is akin to what Manuel DeLanda and Morris Kline call “minimum principles”: the principles or laws back to which phenomena are related and which they are ordered to satisfy. This is the way in which the shortest distance conditions thought and restricts concept creation and experimentation. Deleuzian ontology is exemplary in light of everything advocated for and critiqued here: axiomatic truths are replaced by problems, and logico-linguistically deduced propositions molded and connected to Euclidean geometry are replaced by singularities and affects.
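A standard instance of such a minimum principle is Hamilton’s principle of least (stationary) action; the following is only an illustrative textbook sketch, not a formulation drawn from DeLanda or Kline:

```latex
% Hamilton's principle: of all paths q(t) joining two fixed endpoints,
% the one actually followed makes the action stationary.
\[
  S[q] = \int_{t_1}^{t_2} L\big(q, \dot{q}, t\big)\, dt ,
  \qquad
  \delta S = 0 ,
\]
% where L = T - V is the Lagrangian. Stationarity yields the
% Euler-Lagrange equations, from which the equations of motion
% (Newton's second law, for instance) are recovered:
\[
  \frac{d}{dt}\,\frac{\partial L}{\partial \dot{q}} - \frac{\partial L}{\partial q} = 0 .
\]
% The Kantian "shortest distance" is the geometric special case:
% a geodesic, the path of minimal length between two points.
```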

Scientific study must be populated by models: models that do not represent completion, unity, or finality, but are rather the product of historical experimentation, accumulation, and interaction, and of forces subject to all the contingencies of those historical processes. In this way, fields must not be given closure and must make up a non-axiomatic population of models. Fundamental laws or axioms attain generality only by giving up accuracy, since the conditions for their generality require that all else be held equal. The population approach provides an increase in the descriptive accuracy of phenomena and of complex causal interactions in classical mechanics, at the price of a loss of generality. Physicists have been known to use statistical models to organize data rather than experimentation in which axioms are directly confronted with raw data; such direct confrontation is an active causal intervention, whereas the former is passive observation that unifies and organizes the population of models. Teleological, goal-seeking, finite, and efficient approaches to science should be avoided.

The approach Manuel DeLanda puts forth in Intensive Science and Virtual Philosophy, with the help of Deleuzian ontology, is one that acknowledges the heterogeneous and variable population of scientific models as well as the productive relations between phenomena. The privileging of linguistic truth must also be eliminated, and the gap between mathematical models and linguistically expressed laws must be bridged. The approach takes on the task of well-posing problems: a problematic approach instead of an axiomatic one.

“For problems-Ideas are by nature unconscious: they are extra-propositional and sub-representative, and do not represent the affirmations to which they give rise” (DR 267)

This distribution takes place within an unessential multiplicity. Newton’s laws, then, did not establish and discover an objective and general truth; they posed a well-posed problem.

“These questions are those of the accident, the event, the multiplicity” (DR 188)

The problem, however, should not be subordinated to its solution or to the possibility of its being solved. A properly problematic question necessarily remains linguistic, but it explains the reasons why phenomena occur rather than merely describing regularities. This makes it more accurate in assessing and distributing the relevant and irrelevant factors of phenomena. The same question, posed about the same phenomena by two different individuals, can have entirely different meanings relative to what each is seeking. This difference arises in what Alan Garfinkel calls “contrast spaces”, akin to “state spaces” in physics. The contrast or state space is a space of possibilities, whether geometric, as in physics, or a space of alternatives in an ordinary everyday question. This makes it “metalinguistic”.

“In a typical nonlinear state space, subdivided by multiple attractors and their basins of attraction, the structure of the space of possibilities depends not on some extrinsically defined relation but on the distribution of singularities itself. The trajectories in state space, defining possible sequences of states, are spontaneously broken into equivalence classes by the basins of attraction: if the starting point or initial condition of two different trajectories falls within a given basin both trajectories are bound to end up in the same state, and are equivalent in that respect.” (ISVP 160)
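A toy computational illustration of this partitioning, using a hypothetical one-dimensional system dx/dt = x − x³ with point attractors at x = +1 and x = −1 and a basin boundary at x = 0 (a standard textbook system, not an example DeLanda works through himself):

```python
# Minimal sketch: trajectories of the toy law dx/dt = x - x**3, followed
# with a simple Euler scheme. The attractors sit at x = +1 and x = -1;
# the unstable fixed point at x = 0 is the boundary between their basins.
# Any two initial conditions lying in the same basin end up in the same
# final state, and are equivalent in that respect.

def flow(x: float) -> float:
    """Right-hand side of the toy dynamical law dx/dt = x - x^3."""
    return x - x**3

def final_state(x0: float, dt: float = 0.01, steps: int = 5000) -> float:
    """Integrate forward from the initial condition x0 and return the end state."""
    x = x0
    for _ in range(steps):
        x += dt * flow(x)
    return x

if __name__ == "__main__":
    for x0 in (-2.0, -0.5, -0.01, 0.01, 0.5, 2.0):
        x_end = final_state(x0)
        basin = "+1" if x_end > 0 else "-1"
        print(f"x0 = {x0:+.2f}  ->  settles near {x_end:+.3f}  (basin of x = {basin})")
```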

The “central” thesis here is that an approach which attempts an accurate distribution of singularities manages to pervert the notion of objectivity: problems themselves can now be posed truly or falsely. Whether a question is badly posed is determined in its contrast space, in its vagueness and indeterminacy or its rigidity and overdetermination. An overdetermined question typically holds too many alternatives and can end up including irrelevant ones. What matters here is determining relevance and irrelevance in the contrast spaces of questions; contrast spaces and presuppositions are the key to differentiation and to objective validity in the creation of questions. DeLanda gives the example of a convection cell and its cyclic behavior, where irrelevant “micro-causal” descriptions (such as individual molecules colliding with one another) may be raised. The proper explanation of the phenomenon rests on macro-causal factors such as temperature and density gradients and the competition between gravitational and buoyant forces.
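For readers who want the macro-level description in its usual physical form, the onset of convection is standardly summarized by the dimensionless Rayleigh number, which weighs the buoyancy driving the flow against viscous and thermal dissipation; this is a textbook gloss, not DeLanda’s own formulation:

```latex
% Rayleigh number for a fluid layer of thickness d heated from below,
% with temperature difference \Delta T across it:
\[
  \mathrm{Ra} = \frac{g\, \beta\, \Delta T\, d^{3}}{\nu\, \alpha}
\]
% Here g is gravitational acceleration, \beta the thermal expansion
% coefficient, \nu the kinematic viscosity, and \alpha the thermal
% diffusivity. Convection cells emerge only once Ra exceeds a critical
% value; below it, heat is carried by conduction alone. Individual
% molecular collisions never appear at this level of description.
```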

“Causal problems should be framed at the correct level given that each emergent level has its own causal capacities, these capacities being what differentiates these individuals from each other” (ISVP 162).

In classical physics specifically, laws are expressed as differential equations. The physical quantities making up the dimensions of a state space are presupposed, and the distribution of singularities (possibilities) together with their basins of attraction makes up the contrast space.
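Returning to the pendulum as a hypothetical illustration of this: rewritten as a first-order system, its law defines a flow on a two-dimensional state space whose singularities organize the possible behaviors.

```latex
% The pendulum law as a flow on the state space with coordinates
% (\theta, \omega): angle and angular velocity.
\[
  \dot{\theta} = \omega ,
  \qquad
  \dot{\omega} = -\frac{g}{L}\,\sin\theta .
\]
% Its singularities (fixed points) sit where both right-hand sides
% vanish: (\theta, \omega) = (0, 0), the hanging position, a center
% around which trajectories oscillate; and (\pi, 0), the inverted
% position, a saddle. Adding friction turns the center into a point
% attractor whose basin covers almost the whole space, structuring the
% contrast space of the problem.
```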

All of this has deep ties to Deleuze’s ontology and epistemology. Within the population, some phenomena establish causal relations with events in the actual, while others establish quasi-causal relations with singularities in the virtual. In the bridge between explanatory problems and individual solutions (the former being the virtual counterpart, the latter the actual), questions and problems can be explicated as they give rise to a multiplicity of solutions: they explicate the virtual in reality. This relation must be isomorphic in nature; instead of physics being conceived as a field producing true linguistic propositions that mirror reality, the two planes are bridged and rendered similar. There is no correspondence, representation, or conformity between the planes, but can individuality still be maintained without preventing such isomorphism?

“The philosopher must become isomorphic with the quasi-causal operator, extracting problems from law-expressing propositions and meshing the problems together to endow them with that minimum of autonomy which ensures their irreducibility to their solutions” (ISVP 165)

Another key move in the model DeLanda sets out is the shift from the study of properties to the study of capacities. Individuality must be grasped through a process of individuation that leaves no room for the static, in which we establish individuals in the actual by observing how they affect and are affected within a heterogeneous assemblage alongside other entities. We must isolate the properties (partial objects) that are useful for discovery, so as to innovate more experimentally and more accurately. Scientific knowledge should be established through an interactive multiplicity of relations. Science progressively learns to distinguish the relevant from the irrelevant, placing its accuracy in a state of flux and its subject in a state of open-endedness.

Notes:

[1] It is also worth noting that this approach has links to René Descartes, who hoped that the laws studied by science could be derived from a single unified law or truth of the universe.

[2] Verificationist theories of meaning not only demand the truth of a statement but disregard the statement entirely if its truth cannot be tested by their means. This is a weakness.

Works Cited:

DR: Gilles Deleuze, Difference and Repetition, Columbia University Press, 1995

ISVP: Manuel DeLanda, Intensive Science and Virtual Philosophy, Bloomsbury Academic, revised edition, 2013

ATP: Gilles Deleuze and Félix Guattari, A Thousand Plateaus: Capitalism and Schizophrenia Volume 2, University of Minnesota Press, 1987
