mathstodon.xyz is one of the many independent Mastodon servers you can use to participate in the fediverse.
A Mastodon instance for maths people. We have LaTeX rendering in the web interface!

#analytic


We are now concerned with more radical possibilities.

A paradigmatic example is topology.

In modern “analytic topology”, a “space” is defined to be a set of points equipped with a collection of subsets called open,
which describe how the points vary continuously into each other.
(Most analytic topologists, being unaware of synthetic topology, would call their subject simply “topology.”)

By contrast, in synthetic topology we postulate instead an axiomatic theory, on the same ontological level as ZFC,
whose basic objects are spaces rather than sets.

Of course, by saying that the basic objects “are” spaces we do not mean that they are sets equipped with open subsets.

Instead we mean that “space” is an undefined word,
and the rules of the theory cause these “spaces” to behave more or less like we expect spaces to behave.

In particular, synthetic spaces have open subsets (or, more accurately, open subspaces),
but they are not defined by specifying a set together with a collection of open subsets.

It turns out that synthetic topology, like synthetic set theory (ZFC), is rich enough to encode all of mathematics.

There is one trivial sense in which this is true:
among all analytic spaces we find the subclass of indiscrete ones,
in which the only open subsets are the empty set and the whole space.

A notion of “indiscrete space” can also be defined in synthetic topology,
and the collection of such spaces forms a universe of ETCS-like sets
(we’ll come back to these in later installments).

Thus we could use them to encode mathematics, entirely ignoring the rest of the synthetic theory of spaces.
(The same could be said about the discrete spaces,
in which every subset is open;
but these are harder (though not impossible) to define and work with synthetically.

The relation between the discrete and indiscrete spaces,
and how they sit inside the synthetic theory of spaces,
is central to the synthetic theory of cohesion,
which I believe David is going to mention in his chapter about the philosophy of geometry.)

However, a less boring approach is to construct the objects of mathematics directly as spaces.

How does this work?
It turns out that the basic constructions on sets that we use to build (say) the set of real numbers have close analogues that act on spaces.

Thus, in synthetic topology we can use these constructions to build the space of real numbers directly.

If our system of synthetic topology is set up well,
then the resulting space will behave like the analytic space of real numbers
(the one that is defined by first constructing the mere set of real numbers and then equipping it with the unions of open intervals as its topology).

The next question is,
why would we want to do mathematics this way?

There are a lot of reasons,
but right now I believe they can be classified into three sorts:
modularity,
philosophy, and
pragmatism.

(If you can think of other reasons that I’m forgetting, please mention them in the comments!)

By “#modularity” I mean the same thing as does a programmer:

even if we believe that spaces are ultimately built analytically out of sets,
it is often useful to isolate their fundamental properties and work with those abstractly.

One advantage of this is #generality.
For instance, any theorem proven in Euclid’s “neutral geometry”
(i.e. without using the parallel postulate)
is true not only in the model of ordered pairs of real numbers,
but also in the various non-Euclidean geometries.

Similarly, a theorem proven in synthetic topology may be true not only about ordinary topological spaces,
but also about other variant theories such as topological sheaves, smooth spaces, etc.

As always in mathematics, if we state only the assumptions we need, our theorems become more general.

Mike Shulman:

Mathematical theories can be classified as analytic or synthetic.

An #analytic theory is one that analyzes, or breaks down, its objects of study, revealing them as put together out of simpler things,
just as complex molecules are put together out of protons, neutrons, and electrons.

For example, analytic geometry analyzes the plane geometry of points, lines, etc. in terms of real numbers:
points are ordered pairs of real numbers, lines are sets of points, etc.

Mathematically, the basic objects of an analytic theory are defined in terms of those of some other theory.

By contrast, a #synthetic theory is one that synthesizes,
or puts together,
a conception of its basic objects based on their expected relationships and behavior.

For example, synthetic geometry is more like the geometry of Euclid:
points and lines are essentially undefined terms,
given meaning by the axioms that specify what we can do with them
(e.g. two points determine a unique line).

(Although Euclid himself attempted to define “point” and “line”,
modern mathematicians generally consider this a mistake,
and regard Euclid’s “definitions”
(like “a point is that which has no part”)
as fairly meaningless.)

Mathematically, a synthetic theory is a formal system governed by rules or axioms.

Synthetic mathematics can be regarded as analogous to foundational physics,
where a concept like the electromagnetic field is not “put together” out of anything simpler:
it just is, and behaves in a certain way.

The distinction between analytic and synthetic dates back at least to Hilbert,
who used the words “genetic” and “axiomatic” respectively.

At one level, we can say that modern mathematics is characterized by a rich interplay between analytic and synthetic
— although most mathematicians would speak instead of definitions and examples.

For instance, a modern geometer might define “a geometry” to satisfy Euclid’s axioms,
and then work synthetically with those axioms;
but she would also construct examples of such “geometries” analytically,
such as with ordered pairs of real numbers.

This approach was pioneered by Hilbert himself, who emphasized in particular that constructing an analytic example (or model) proves the consistency of the synthetic theory.

However, at a deeper level, almost all of modern mathematics is analytic, because it is all analyzed into set theory. Our modern geometer would not actually state her axioms the way that Euclid did; she would instead define a geometry to be a set P of points together with a set L of lines and a subset of P×L representing the “incidence” relation, etc.
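The set-theoretic encoding just described can be made concrete in a small sketch. The following toy model is my own illustration (not from the original post): points are ordered pairs over the two-element field, lines are sets of points, and incidence is just membership, so Euclid's “two points determine a unique line” becomes a checkable statement about sets.

```python
from itertools import combinations

# A toy analytic "geometry" in the set-theoretic style described above.
# Points are ordered pairs over GF(2); lines are sets of points (solution
# sets of linear equations); incidence is membership.  This is the affine
# plane AG(2,2) -- an illustrative choice, not an example from the post.
P = [(x, y) for x in (0, 1) for y in (0, 1)]

lines = set()
for a in (0, 1):
    for b in (0, 1):
        if (a, b) == (0, 0):
            continue  # a = b = 0 defines no line
        for c in (0, 1):
            lines.add(frozenset(p for p in P if (a * p[0] + b * p[1]) % 2 == c))
L = list(lines)

def incident(p, l):
    """The incidence relation: a subset of P x L, realized as membership."""
    return p in l

# Euclid's axiom, stated analytically: two distinct points lie on
# exactly one common line.
for p, q in combinations(P, 2):
    common = [l for l in L if incident(p, l) and incident(q, l)]
    assert len(common) == 1
```

Everything here is a definition inside set theory; the “axiom” is a theorem about the model, which is exactly Hilbert's point about consistency via analytic examples.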

From this perspective, the only truly undefined term in mathematics is “set”, and the only truly synthetic theory is Zermelo–Fraenkel set theory (ZFC).

This use of set theory as the common foundation for mathematics is, of course, of 20th century vintage,
and overall it has been a tremendous step forwards.

Practically, it provides a common language and a powerful basic toolset for all mathematicians.

Foundationally, it ensures that all of mathematics is consistent relative to set theory.

(Hilbert’s dream of an absolute consistency proof is generally considered to have been demolished by Gödel’s incompleteness theorem.)

And philosophically, it supplies a consistent ontology for mathematics, and a context in which to ask metamathematical questions.

However, ZFC is not the only theory that can be used in this way.
While not every synthetic theory is rich enough to allow all of mathematics to be encoded in it,
set theory is by no means unique in possessing such richness.

One possible variation is to use a different sort of set theory like ETCS,
in which the elements of a set are “featureless points” that are merely distinguished from each other,
rather than labeled individually by the elaborate hierarchical membership structures of ZFC.

Either sort of “set” suffices just as well for foundational purposes, and moreover each can be interpreted into the other.
golem.ph.utexas.edu/category/2


Whenever I walk to or from home, I have to walk up or down an inclined street; I noticed that the asphalt surface has different curvatures depending on how near it is to a bend, and I try to find a less steep incline while walking.

This gave me the inspiration for the few questions below. Any simple explanations and related links are welcome.

Given a surface within R^3, and two distinct points on it, there are infinitely many differentiable paths from one point to the other that remain on the surface. At each point of the path, one can find the path's local curvature. Then:

- Find a path that minimizes the supremum of the curvature. In other words, find the "flattest" path.

- Find a path that minimizes the variation of the curvature. In other words, find a path that "most resembles" a circular arc.

Are these tasks always possible under the given conditions? Are any stronger conditions needed? Are there cases with an analytic solution, or are they possible only with numerical approximations?
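As a numerical sketch of the first question (not an answer to whether a minimizer exists): one can sample a candidate path on the surface, estimate its curvature by finite differences via κ = |r′ × r″| / |r′|³, and compare the supremum of κ across candidates. The paraboloid and the two paths below are hypothetical choices for illustration only.

```python
import numpy as np

def sup_curvature(r, t):
    """Estimate the supremum of the curvature of a sampled space curve
    r(t) (array of shape (n, 3)) using kappa = |r' x r''| / |r'|^3,
    with derivatives taken by finite differences."""
    r1 = np.gradient(r, t, axis=0)   # r'
    r2 = np.gradient(r1, t, axis=0)  # r''
    kappa = np.linalg.norm(np.cross(r1, r2), axis=1) / np.linalg.norm(r1, axis=1) ** 3
    return kappa.max()

# Hypothetical comparison: two paths on the paraboloid z = x^2 + y^2
# joining (0, 0, 0) to (1, 1, 2); both the surface and the paths are
# illustrative choices, not part of the original question.
t = np.linspace(0.0, 1.0, 2001)
path1 = np.stack([t, t, 2 * t**2], axis=1)        # straight in the (x, y) plane
path2 = np.stack([t**2, t, t**4 + t**2], axis=1)  # a bent alternative
print(sup_curvature(path1, t), sup_curvature(path2, t))
```

Wrapping this in an optimizer over a parametrized family of paths would give a crude search for the "flattest" one; existence of an actual minimizer is the harder analytic question the post asks.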

One of the distinctions central to Immanuel Kant's work is that between analytic and synthetic. Here's a core concept video looking at his discussion of that distinction early in the Prolegomena to Any Future Metaphysics.

youtu.be/390JHuDsMUw
#Video #Kant #Distinction #Philosophy #Analytic #Synthetic #Metaphysics #Judgements


@katchwreck I agree that basic physical quantities can be defined through an integral, most notably the action integral leading to Lagrange's equations. I also agree that an integral tends to smooth its integrand.

Have you considered that the integrands themselves are already smooth due to being analytic? The idea is that physics, at least classically, depends on analytic functions, which by definition are already the smoothest functions available.
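To make the "analytic means maximally smooth" point tangible, here is a minimal sketch (my illustration, not the poster's): an analytic function such as exp is reproduced by its convergent power series, from which derivatives of every order can be read off term by term.

```python
import math

def exp_taylor(x, terms=30):
    """Partial sum of the Taylor series of exp at 0.  An analytic
    function is locally equal to such a series, so it automatically
    has derivatives of all orders."""
    total, term = 0.0, 1.0
    for n in range(terms):
        total += term
        term *= x / (n + 1)  # next term: x^(n+1) / (n+1)!
    return total

print(exp_taylor(1.0), math.exp(1.0))  # the two agree to machine precision
```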

There was a major effort in the early 60s to extend this idea to quantum field theory. It was called the analytic S-matrix program, and it was the hottest thing in physics until quarks came along. The concepts behind the analytic S-matrix still appear from time to time when field theory gets too complicated to calculate. Given the "success" of string theory, who knows what the future holds for analyticity in physics.

150 (or so) Arguments for #Atheism

"A popular view in contemporary #analytic #philosophy of #religion is that while there are many arguments for #theism -- #cosmological, #ontological, and #teleological arguments; #moral arguments; arguments from #consciousness; etc. (...) -- there are only one or two arguments for atheism, viz., the problem of #evil and (more recently) the argument from #divine hiddenness.

This is a misconception. Here are well over 150"

exapologist.blogspot.com/2022/

@philosophy


#SoftMatter have just published the results of a project that Renato Assante, Davide Marenduzzo, Alexander Morozov, and I recently worked on together! What did we do and what’s new? Briefly…

#Microswimmer suspensions behave in a similar way to fluids containing kinesin and microtubules. Both systems can be described by the same system of three coupled nonlinear #PDEs.

A #LinearStabilityAnalysis of these equations suggests that variations in concentration across the system don’t significantly affect emergent #phaseBehaviour. How then can we explain #experiments that show visible inhomogeneities in #microtubule-#kinesin mixtures, for instance?

With increasing activity, we move away from the quiescent regime, past the onset of #SpontaneousFlow, and deeper into the active phase, where #nonlinearities become more important. What role do concentration inhomogeneities play here?

We investigated these questions, taking advantage of the #openSource #Dedalus #spectral framework to simulate the full nonlinear time evolution. This led us to predict a #novel regime of #spontaneous #microphaseSeparation into active (nematically ordered) and passive domains.

Active flow arrests macrophase separation in this regime, counteracting domain coarsening due to thermodynamic coupling between active matter concentration and #nematic order. As a result, domains reach a characteristic size that decreases with increasing activity.

This regime is one part of the #PhaseDiagram we mapped out. Along with our other findings, you can read all about it here!

low #ReynoldsNumber #turbulence #ActiveTurbulence #CahnHilliard #ActiveMatter #NavierStokes #BerisEdwards #CondensedMatter #PhaseTransitions #TheoreticalPhysics #BioPhysics #StatisticalPhysics #FluidDynamics #ComputationalPhysics #Simulation #FieldTheory #paperthread #NewPaper #science #research #ActiveGel #activeNematic #analytic #cytoskeleton #hydrodynamics #MPI #theory
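The spectral time stepping the authors describe (done with Dedalus in the paper) can be illustrated by a much smaller stand-in. The sketch below is my own minimal 1-D Cahn–Hilliard-type solver using NumPy FFTs with a semi-implicit split (stiff linear term implicit, nonlinearity explicit); it is not the paper's model or code, and the parameters are placeholders.

```python
import numpy as np

def cahn_hilliard_1d(phi0, eps=0.05, dt=1e-4, steps=200, L=2 * np.pi):
    """Semi-implicit Fourier (spectral) time stepping for a 1-D
    Cahn-Hilliard-type equation, phi_t = (phi^3 - phi - eps^2 phi_xx)_xx.
    The stiff fourth-order linear term is treated implicitly, the
    nonlinearity explicitly -- a common split in spectral PDE solvers."""
    n = phi0.size
    k = 2 * np.pi * np.fft.fftfreq(n, d=L / n)  # angular wavenumbers
    k2, k4 = k**2, k**4
    phi = phi0.copy()
    phi_hat = np.fft.fft(phi)
    for _ in range(steps):
        nhat = np.fft.fft(phi**3 - phi)  # explicit nonlinear term
        phi_hat = (phi_hat - dt * k2 * nhat) / (1.0 + dt * eps**2 * k4)
        phi = np.fft.ifft(phi_hat).real
    return phi

# Placeholder initial condition: a small modulation about a uniform mean.
n = 128
x = np.linspace(0.0, 2 * np.pi, n, endpoint=False)
phi0 = 0.5 + 0.02 * np.cos(3 * x)
phi = cahn_hilliard_1d(phi0)
# The k = 0 mode is untouched by the update, so the mean ("mass") is conserved.
```

The same implicit/explicit split scales to the coupled concentration–nematic–flow system in the paper, which is where frameworks like Dedalus earn their keep.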

#SoftMatter have just published the results of a project that Renato Assante, Davide Marenduzzo, Alexander Morozov, and I recently worked on together! What did we do and what’s new? Briefly…

The #hydrodynamic behaviour of inhomogeneous #activeNematic gels (such as extensile bundles of #cytoskeletal filaments or suspensions of low #ReynoldsNumber swimmers) can be described by the time evolution of three coupled #PDEs.

Standard #ActiveGel #theory concludes, from a #LinearStabilityAnalysis of these equations, that fluctuations in concentration don’t significantly affect emergent #phaseBehaviour. However, this leaves #experimental #observations of visible inhomogeneities in #microtubule-#kinesin mixtures unexplained. As we move away from the passive (quiescent) regime, past the onset of #SpontaneousFlow, and deeper into the active phase, #nonlinearities become more important. What role do concentration inhomogeneities play here?

Alongside #analytic techniques, we used an in-house #MPI-parallel code developed within the #Dedalus #spectral framework to investigate. We predict a #novel regime of #spontaneous #microphaseSeparation into active (nematically ordered) and passive domains. In this regime, active flow arrests macrophase separation, which is itself driven by the thermodynamic coupling between active matter concentration and #nematic order. As a result, domains do not #coarsen past a typical size, which decreases with increasing activity. This regime is one part of the #PhaseDiagram we mapped out.

Along with our other findings, you can read all about it here!


@edendestroyer
It all depends, as they say in @philosophy , what you mean by "define".
Are you looking for a dictionary definition to replace those in the OED or Webster? Are you establishing an axiom, along with axioms on "concepts", "information", and "comprehend" (which are doing a lot of heavy lifting here) and some lemmas, in order to derive something? Are you being polemical, looking for the ever-popular "it's true by definition" result? Do you think that meaning derives from definition?

Anybody coming up with Mastodon #data #analytic tools...

For instance:
Get a user's comparative toots-replies-boosts stats?
Do something similar for all members of a particular instance?
Who's writing?
Where are all of the boosts for a message coming from?

I'm about as far from a "quant" or coder as you can get, but curious about ways to imagine and describe this new collective community.