Topic: Mathematics (Page 18)
You are looking at all articles with the topic "Mathematics". We found 224 matches.
Hint:
To view all topics, click here. To see the most popular topics, click here instead.
Tendril perversion – spontaneous symmetry breaking, uncoiling helical structures
Tendril perversion, often referred to in context as simply perversion, is a geometric phenomenon found in helical structures such as plant tendrils, in which a helical structure forms that is divided into two sections of opposite chirality, with a transition between the two in the middle. A similar phenomenon can often be observed in kinked helical cables such as telephone handset cords.
The phenomenon was known to Charles Darwin, who wrote in 1865,
A tendril ... invariably becomes twisted in one part in one direction, and in another part in the opposite direction... This curious and symmetrical structure has been noticed by several botanists, but has not been sufficiently explained.
The term "tendril perversion" was coined by Goriely and Tabor in 1998 based on the word perversion found in the 19th Century science literature. "Perversion" is a transition from one chirality to another and was known to James Clerk Maxwell, who attributed it to the topologist J. B. Listing.
Tendril perversion can be viewed as an example of spontaneous symmetry breaking, in which the strained structure of the tendril adopts a configuration of minimum energy while preserving zero overall twist.
Tendril perversion has been studied both experimentally and theoretically. Gerbode et al. have made experimental studies of the coiling of cucumber tendrils. A detailed study of a simple model of the physics of tendril perversion was made by McMillen and Goriely in the early 2000s. Liu et al. showed in 2014 that "the transition from a helical to a hemihelical shape, as well as the number of perversions, depends on the height to width ratio of the strip's cross-section."
Generalized tendril perversions were put forward by Silva et al., to include perversions that can be intrinsically produced in elastic filaments, leading to a multiplicity of geometries and dynamical properties.
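As a purely geometric illustration (not the elastic-rod model studied by McMillen and Goriely), a space curve whose winding rate reverses sign at its midpoint already exhibits a perversion: the two halves coil with opposite chirality around the axis, and the net winding is zero. The angular law theta(t) = log(cosh(t)) in the sketch below is an arbitrary choice, made only so that theta'(t) = tanh(t) changes sign at t = 0.

```python
import numpy as np

# A toy curve with one "perversion": the winding rate theta'(t) = tanh(t)
# is negative for t < 0 and positive for t > 0, so the two halves of the
# curve coil with opposite handedness around the z-axis.
t = np.linspace(-10.0, 10.0, 2001)
radius, pitch = 1.0, 0.3

theta = np.log(np.cosh(t))      # integral of tanh(t); winding reverses at t = 0
x = radius * np.cos(theta)
y = radius * np.sin(theta)
z = pitch * t

# The winding accumulated on one half is undone on the other, so the net
# rotation about the axis over the whole curve is zero.
print(theta[-1] - theta[0])     # 0.0
```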
Discussed on
- "Tendril perversion β spontaneous symmetry breaking, uncoiling helical structures" | 2016-04-19 | 23 Upvotes 5 Comments
Bernoulli Family
The Bernoulli family (German pronunciation: [bɛʁˈnʊli]) of Basel was a patrician family, notable for having produced eight mathematically gifted academics who, among them, contributed substantially to the development of mathematics and physics during the early modern period.
Discussed on
- "Bernoulli Family" | 2023-06-17 | 22 Upvotes 5 Comments
Von Neumann's Elephant
Von Neumann's elephant is a problem in recreational mathematics, consisting of constructing a planar curve in the shape of an elephant from only four fixed parameters. It originated from a discussion between physicists John von Neumann and Enrico Fermi.
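The construction behind the puzzle (recovering a closed planar curve from a handful of complex Fourier coefficients) can be sketched as follows. The coefficient values and the way they are split between x(t) and y(t) are placeholders chosen for illustration only; the published four-parameter elephant fit is given in Mayer, Khairy and Howard's 2010 paper "Drawing an elephant with four complex parameters".

```python
import numpy as np

# Four complex parameters, each contributing one Fourier harmonic to a closed
# planar curve. These values are arbitrary placeholders, NOT the published
# elephant coefficients.
params = [8 - 5j, 3 + 2j, 2 - 1j, -1 - 6j]

def curve(t, params):
    """Evaluate x(t), y(t) of the closed curve encoded by the complex parameters."""
    x = sum(p.real * np.cos((k + 1) * t) for k, p in enumerate(params))
    y = sum(p.imag * np.sin((k + 1) * t) for k, p in enumerate(params))
    return x, y

t = np.linspace(0.0, 2.0 * np.pi, 500)
x, y = curve(t, params)   # feed x, y to any plotting library to see the shape
```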
Discussed on
- "Von Neumann's Elephant" | 2024-04-13 | 21 Upvotes 5 Comments
Nash equilibrium
In game theory, the Nash equilibrium, named after the mathematician John Forbes Nash Jr., is a proposed solution of a non-cooperative game involving two or more players in which each player is assumed to know the equilibrium strategies of the other players, and no player has anything to gain by changing only their own strategy.
In terms of game theory, if each player has chosen a strategy, and no player can benefit by changing strategies while the other players keep theirs unchanged, then the current set of strategy choices and their corresponding payoffs constitutes a Nash equilibrium.
Stated simply, Alice and Bob are in Nash equilibrium if Alice is making the best decision she can, taking into account Bob's decision while his decision remains unchanged, and Bob is making the best decision he can, taking into account Alice's decision while her decision remains unchanged. Likewise, a group of players are in Nash equilibrium if each one is making the best decision possible, taking into account the decisions of the others in the game as long as the other parties' decisions remain unchanged.
Nash showed that every finite game has at least one Nash equilibrium, possibly in mixed strategies.
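Below is a minimal sketch of the definition: a brute-force search for pure-strategy Nash equilibria in a two-player game. The payoff matrices are the usual textbook Prisoner's Dilemma values, chosen here only as an example and not taken from the article above.

```python
import numpy as np

# Payoff matrices for a two-player game: rows are player 1's actions,
# columns are player 2's (0 = cooperate, 1 = defect in the Prisoner's Dilemma).
A = np.array([[-1, -3],
              [ 0, -2]])   # player 1's payoffs
B = np.array([[-1,  0],
              [-3, -2]])   # player 2's payoffs

def pure_nash_equilibria(A, B):
    """Return all action pairs (i, j) from which neither player gains by deviating alone."""
    equilibria = []
    for i in range(A.shape[0]):
        for j in range(A.shape[1]):
            best_for_1 = A[i, j] >= A[:, j].max()   # player 1 cannot do better against column j
            best_for_2 = B[i, j] >= B[i, :].max()   # player 2 cannot do better against row i
            if best_for_1 and best_for_2:
                equilibria.append((i, j))
    return equilibria

print(pure_nash_equilibria(A, B))   # [(1, 1)]: mutual defection
```

Note that Nash's existence theorem guarantees an equilibrium for every finite game only when mixed strategies are allowed; the sketch above checks pure strategies only.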
Discussed on
- "Nash equilibrium" | 2018-10-11 | 20 Upvotes 6 Comments
Type I and type II errors
In statistical hypothesis testing, a type I error is the rejection of a true null hypothesis (also known as a "false positive" finding or conclusion), while a type II error is the non-rejection of a false null hypothesis (also known as a "false negative" finding or conclusion). Much of statistical theory revolves around the minimization of one or both of these errors, though completely eliminating either is impossible when outcomes are not deterministic. The two error rates are traded off against each other through the choice of the significance level (alpha, the threshold on the p-value): lowering alpha makes type I errors rarer at the cost of more type II errors. Knowledge of type I and type II errors is widely used in medical science, biometrics and computer science.
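As a concrete, hypothetical illustration of the trade-off, the simulation below estimates both error rates for a one-sample t-test of H0: mean = 0 at alpha = 0.05, assuming normally distributed data with unit variance and, for the type II case, a true mean of 0.5. None of these numbers come from the article above.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
alpha, n, trials = 0.05, 30, 5_000

def rejection_rate(true_mean):
    """Fraction of simulated samples for which the t-test rejects H0: mean = 0."""
    rejections = 0
    for _ in range(trials):
        sample = rng.normal(loc=true_mean, scale=1.0, size=n)
        _, p_value = stats.ttest_1samp(sample, popmean=0.0)
        rejections += p_value < alpha
    return rejections / trials

print("type I error rate  (H0 true, mean = 0):   ", rejection_rate(0.0))      # close to alpha
print("type II error rate (H0 false, mean = 0.5):", 1 - rejection_rate(0.5))  # missed detections
```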
Discussed on
- "Type I and type II errors" | 2014-04-26 | 17 Upvotes 8 Comments
Itô Calculus
Itô calculus, named after Kiyosi Itô, extends the methods of calculus to stochastic processes such as Brownian motion (see Wiener process). It has important applications in mathematical finance and stochastic differential equations.
The central concept is the Itô stochastic integral, a stochastic generalization of the Riemann–Stieltjes integral in analysis. The integrands and the integrators are now stochastic processes:

Y_t = ∫_0^t H_s dX_s,
where H is a locally square-integrable process adapted to the filtration generated by X (Revuz & Yor 1999, Chapter IV), and X is a Brownian motion or, more generally, a semimartingale. The result of the integration is another stochastic process: the integral from 0 to any particular t is a random variable, defined as a limit of a certain sequence of random variables.

The paths of Brownian motion do not satisfy the requirements for applying the standard techniques of calculus, so the Itô stochastic integral amounts to an integral of a stochastic process with respect to a function that is not differentiable at any point and has infinite variation over every time interval. The main insight is that the integral can nevertheless be defined as long as the integrand H is adapted, which loosely speaking means that its value at time t can only depend on information available up to that time. Roughly speaking, one chooses a sequence of partitions of the interval from 0 to t and constructs Riemann sums; each Riemann sum uses a particular realization of the integrator, and it is crucial which point in each of the small intervals is used to compute the value of the function. The limit is then taken in probability as the mesh of the partition goes to zero. Numerous technical details have to be taken care of to show that this limit exists and is independent of the particular sequence of partitions. Typically, the left end of each interval is used.
Important results of ItΓ΄ calculus include the integration by parts formula and ItΓ΄'s lemma, which is a change of variables formula. These differ from the formulas of standard calculus, due to quadratic variation terms.
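Both the left-endpoint construction and the quadratic-variation correction can be seen numerically. The sketch below is an illustration with W serving as both integrand and integrator: it approximates ∫_0^T W dW by left-endpoint Riemann sums along one simulated Brownian path and compares the result with the value (W_T^2 - T)/2 predicted by Itô's lemma, whereas ordinary calculus would give W_T^2/2.

```python
import numpy as np

rng = np.random.default_rng(1)
T, steps = 1.0, 200_000
dt = T / steps

# One simulated Brownian path W on [0, T].
dW = rng.normal(0.0, np.sqrt(dt), size=steps)
W = np.concatenate([[0.0], np.cumsum(dW)])

# Ito integral of W dW via left-endpoint Riemann sums: each increment is
# weighted only by information available before it occurs (adaptedness).
ito_sum = np.sum(W[:-1] * dW)

# Ito's lemma gives the exact value (W_T**2 - T) / 2; the extra -T/2 is the
# quadratic-variation term that ordinary calculus lacks.
print(ito_sum, (W[-1] ** 2 - T) / 2)
```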
In mathematical finance, this evaluation strategy for the integral is interpreted as first deciding what to do, then observing the change in the prices. The integrand is how much stock we hold, the integrator represents the movement of prices, and the integral is how much money we have in total, including what our stock is worth, at any given moment. The prices of stocks and other traded financial assets can be modeled by stochastic processes such as Brownian motion or, more often, geometric Brownian motion (see Black–Scholes). The Itô stochastic integral then represents the payoff of a continuous-time trading strategy consisting of holding an amount H_t of the stock at time t. In this situation, the condition that H is adapted corresponds to the necessary restriction that the trading strategy can only make use of the information available at any time, which rules out unlimited gains through clairvoyance: buying the stock just before each uptick in the market and selling before each downtick. Similarly, the condition that H is adapted implies that the stochastic integral will not diverge when calculated as a limit of Riemann sums (Revuz & Yor 1999, Chapter IV).
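The trading interpretation can be sketched the same way. Everything below is hypothetical: a geometric Brownian motion price path with made-up drift and volatility, and a toy adapted strategy (hold one share whenever the price sits below its running mean, a rule that uses only past prices). The gain is the discrete left-endpoint approximation of ∫ H dS.

```python
import numpy as np

rng = np.random.default_rng(2)
T, steps = 1.0, 252
dt = T / steps
mu, sigma, S0 = 0.05, 0.2, 100.0   # hypothetical drift, volatility, initial price

# Geometric Brownian motion price path.
dW = rng.normal(0.0, np.sqrt(dt), size=steps)
S = S0 * np.concatenate([[1.0], np.exp(np.cumsum((mu - 0.5 * sigma ** 2) * dt + sigma * dW))])

# An adapted strategy: the holding at each step depends only on prices seen so far.
# Toy rule: hold 1 share while the current price is below its running mean, else 0.
running_mean = np.cumsum(S[:-1]) / np.arange(1, steps + 1)
H = (S[:-1] < running_mean).astype(float)

# Trading gain = left-endpoint sum of H dS: the position is fixed before each
# price increment is revealed, which is exactly the adaptedness restriction.
gain = np.sum(H * np.diff(S))
print(gain)
```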
Discussed on
- "ItΓ΄ Calculus" | 2023-08-03 | 22 Upvotes 3 Comments
A function that represents all primes
In number theory, a formula for primes is a formula generating the prime numbers, exactly and without exception. No such formula which is efficiently computable is known. A number of constraints are known, showing what such a "formula" can and cannot be.
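One classical example of such a formula is Willans' (1964), which encodes Wilson's theorem in a trigonometric expression for the nth prime; it is exact but hopelessly inefficient. The sketch below replaces the cosine-based indicator with its integer equivalent, the test ((j-1)! + 1) mod j == 0, which holds exactly when j = 1 or j is prime. That substitution and the variable names are editorial choices, not taken from the article above.

```python
from math import factorial

def is_one_or_prime(j):
    # Wilson's theorem: (j-1)! + 1 is divisible by j exactly when j is prime
    # (and, trivially, when j = 1).
    return (factorial(j - 1) + 1) % j == 0

def nth_prime(n):
    """Willans-style formula for the nth prime, with the trig indicator made integral.

    Exact but absurdly slow: it scans 1..2**n and computes huge factorials.
    """
    total = 1
    primes_seen_plus_one = 0            # running value of 1 + pi(i)
    for i in range(1, 2 ** n + 1):
        primes_seen_plus_one += is_one_or_prime(i)
        # The floor((n / (1 + pi(i))) ** (1/n)) term of Willans' formula
        # equals 1 while 1 + pi(i) <= n and 0 afterwards.
        total += primes_seen_plus_one <= n
    return total

print([nth_prime(n) for n in range(1, 6)])   # [2, 3, 5, 7, 11]
```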
Discussed on
- "A function that represents all primes" | 2019-10-05 | 16 Upvotes 8 Comments
Pólya conjecture
In number theory, the Pólya conjecture stated that "most" (i.e., 50% or more) of the natural numbers less than any given number have an odd number of prime factors. The conjecture was posited by the Hungarian mathematician George Pólya in 1919, and proved false in 1958 by C. Brian Haselgrove.
The size of the smallest counterexample is often used to show how a conjecture can be true for many cases, and still be false, providing an illustration for the strong law of small numbers.
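A direct check is easy to sketch: the conjecture says the summatory Liouville function L(n), the sum of lambda(k) = (-1)^Omega(k) over k ≤ n, never becomes positive for n ≥ 2. The brute-force search below verifies this for small n; it will not reach the smallest counterexample, which lies near 9 × 10^8.

```python
def liouville(n):
    """lambda(n) = (-1) ** Omega(n), where Omega counts prime factors with multiplicity."""
    count, d = 0, 2
    while d * d <= n:
        while n % d == 0:
            n //= d
            count += 1
        d += 1
    if n > 1:           # leftover prime factor
        count += 1
    return -1 if count % 2 else 1

def first_counterexample(limit):
    """Smallest n in [2, limit] with L(n) > 0, or None if the conjecture holds that far."""
    L = 0
    for n in range(1, limit + 1):
        L += liouville(n)
        if n >= 2 and L > 0:
            return n
    return None

print(first_counterexample(100_000))   # None: no counterexample in this range
```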
Discussed on
- "PΓ³lya conjecture" | 2009-08-29 | 13 Upvotes 10 Comments
John von Neumann
John von Neumann (Hungarian: Neumann János Lajos, pronounced [ˈnɒjmɒn ˈjaːnoʃ ˈlɒjoʃ]; December 28, 1903 – February 8, 1957) was a Hungarian-American mathematician, physicist, computer scientist, engineer and polymath. Von Neumann was generally regarded as the foremost mathematician of his time and said to be "the last representative of the great mathematicians" who integrated both pure and applied sciences.
He made major contributions to a number of fields, including mathematics (foundations of mathematics, functional analysis, ergodic theory, representation theory, operator algebras, geometry, topology, and numerical analysis), physics (quantum mechanics, hydrodynamics, and quantum statistical mechanics), economics (game theory), computing (Von Neumann architecture, linear programming, self-replicating machines, stochastic computing), and statistics.
He was a pioneer of the application of operator theory to quantum mechanics in the development of functional analysis, and a key figure in the development of game theory and the concepts of cellular automata, the universal constructor and the digital computer.
He published over 150 papers in his life: about 60 in pure mathematics, 60 in applied mathematics, 20 in physics, and the remainder on special mathematical subjects or non-mathematical ones. His last work, an unfinished manuscript written while he was in hospital, was later published in book form as The Computer and the Brain.
His analysis of the structure of self-replication preceded the discovery of the structure of DNA. In a short list of facts about his life he submitted to the National Academy of Sciences, he stated, "The part of my work I consider most essential is that on quantum mechanics, which developed in Göttingen in 1926, and subsequently in Berlin in 1927–1929. Also, my work on various forms of operator theory, Berlin 1930 and Princeton 1935–1939; on the ergodic theorem, Princeton, 1931–1932."
During World War II, von Neumann worked on the Manhattan Project with theoretical physicist Edward Teller, mathematician Stanisław Ulam and others, solving key problems in the nuclear physics involved in thermonuclear reactions and the hydrogen bomb. He developed the mathematical models behind the explosive lenses used in the implosion-type nuclear weapon, and coined the term "kiloton" (of TNT) as a measure of the explosive force generated.
After the war, he served on the General Advisory Committee of the United States Atomic Energy Commission, and consulted for a number of organizations, including the United States Air Force, the Army's Ballistic Research Laboratory, the Armed Forces Special Weapons Project, and the Lawrence Livermore National Laboratory. As a Hungarian émigré, concerned that the Soviets would achieve nuclear superiority, he designed and promoted the policy of mutually assured destruction to limit the arms race.
Discussed on
- "John von Neumann" | 2015-06-26 | 20 Upvotes 3 Comments
Abstract Nonsense
In mathematics, abstract nonsense, general abstract nonsense, generalized abstract nonsense, and general nonsense are nonderogatory terms used by mathematicians to describe long, theoretical parts of a proof they skip over when readers are expected to be familiar with them. These terms are mainly used for abstract methods related to category theory and homological algebra. More generally, "abstract nonsense" may refer to a proof that relies on category-theoretic methods, or even to the study of category theory itself.
Discussed on
- "Abstract Nonsense" | 2023-08-26 | 19 Upvotes 4 Comments