Topic: Mathematics (Page 6)

You are looking at all articles with the topic "Mathematics". We found 223 matches.

Hint: To view all topics, click here. To see the most popular topics, click here instead.

🔗 Untouchable Number

🔗 Mathematics

An untouchable number is a positive integer that cannot be expressed as the sum of all the proper divisors of any positive integer (including the untouchable number itself). That is, these numbers are not in the image of the aliquot sum function. Their study goes back at least to Abu Mansur al-Baghdadi (circa 1000 AD), who observed that both 2 and 5 are untouchable.
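The definition translates directly into a brute-force search; a small Python sketch (function names are illustrative) that recovers al-Baghdadi's observation. The search bound works because a composite n has a proper divisor of at least √n, so its aliquot sum exceeds √n; primes only ever contribute an aliquot sum of 1.

```python
def aliquot(n):
    # Sum of the proper divisors of n (divisors strictly less than n).
    return sum(d for d in range(1, n // 2 + 1) if n % d == 0)

def untouchable_up_to(limit):
    # Any composite n with aliquot(n) <= limit satisfies n <= limit**2,
    # so scanning up to limit**2 is enough to certify untouchability.
    touched = {aliquot(n) for n in range(2, limit ** 2 + 1)}
    return [k for k in range(2, limit + 1) if k not in touched]
```

Here `untouchable_up_to(10)` returns `[2, 5]`, matching al-Baghdadi's pair.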

Discussed on

🔗 68–95–99.7 Rule

🔗 Mathematics 🔗 Statistics

In statistics, the 68–95–99.7 rule, also known as the empirical rule, is a shorthand used to remember the percentage of values that lie within an interval estimate in a normal distribution: 68%, 95%, and 99.7% of the values lie within one, two, and three standard deviations of the mean, respectively.

In mathematical notation, these facts can be expressed as follows, where Pr() is the probability function, X is an observation from a normally distributed random variable, μ (mu) is the mean of the distribution, and σ (sigma) is its standard deviation:

Pr(μ − 1σ ≤ X ≤ μ + 1σ) ≈ 68.27%
Pr(μ − 2σ ≤ X ≤ μ + 2σ) ≈ 95.45%
Pr(μ − 3σ ≤ X ≤ μ + 3σ) ≈ 99.73%

The usefulness of this heuristic especially depends on the question under consideration.

In the empirical sciences, the so-called three-sigma rule of thumb (or 3σ rule) expresses a conventional heuristic that nearly all values are taken to lie within three standard deviations of the mean, and thus it is empirically useful to treat 99.7% probability as near certainty.

In the social sciences, a result may be considered "significant" if its confidence level is of the order of a two-sigma effect (95%), while in particle physics, there is a convention of a five-sigma effect (99.99994% confidence) being required to qualify as a discovery.

A weaker three-sigma rule can be derived from Chebyshev's inequality, stating that even for non-normally distributed variables, at least 88.8% of cases should fall within properly calculated three-sigma intervals. For unimodal distributions, the probability of being within the interval is at least 95% by the Vysochanskij–Petunin inequality. There may be certain assumptions for a distribution that force this probability to be at least 98%.
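The normal coverage probabilities above, and the Chebyshev lower bound, can both be checked directly; a minimal Python sketch using the standard normal CDF via math.erf:

```python
import math

def normal_coverage(k):
    # P(mu - k*sigma <= X <= mu + k*sigma) for X ~ N(mu, sigma^2);
    # independent of mu and sigma, it equals erf(k / sqrt(2)).
    return math.erf(k / math.sqrt(2))

def chebyshev_bound(k):
    # Distribution-free lower bound: P(|X - mu| <= k*sigma) >= 1 - 1/k^2.
    return 1 - 1 / k ** 2

for k in (1, 2, 3):
    print(f"{k} sigma: normal {normal_coverage(k):.4%}, "
          f"Chebyshev bound {chebyshev_bound(k):.2%}")
```

At k = 3 this prints the 99.73% normal coverage against Chebyshev's much weaker 88.89% floor, which is the gap the paragraph above describes.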

Discussed on

🔗 Ménage Problem

🔗 Mathematics

In combinatorial mathematics, the ménage problem or problème des ménages asks for the number of different ways in which it is possible to seat a set of male-female couples at a round dining table so that men and women alternate and nobody sits next to his or her partner. This problem was formulated in 1891 by Édouard Lucas and independently, a few years earlier, by Peter Guthrie Tait in connection with knot theory. For a number of couples equal to 3, 4, 5, ... the number of seating arrangements is

12, 96, 3120, 115200, 5836320, 382072320, 31488549120, ... (sequence A059375 in the OEIS).

Mathematicians have developed formulas and recurrence equations for computing these numbers and related sequences of numbers. Along with their applications to etiquette and knot theory, these numbers also have a graph theoretic interpretation: they count the numbers of matchings and Hamiltonian cycles in certain families of graphs.
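One such formula, Touchard's closed form for the ménage numbers, reproduces the sequence above; a sketch (the factor 2·n! counts the ways to seat the women first, which is the convention behind A059375):

```python
from math import comb, factorial

def menage_number(n):
    # Touchard's formula:
    # A_n = sum_{k=0}^{n} (-1)^k * 2n/(2n-k) * C(2n-k, k) * (n-k)!
    return sum((-1) ** k * 2 * n * comb(2 * n - k, k) * factorial(n - k)
               // (2 * n - k)
               for k in range(n + 1))

def seatings(n):
    # Alternating seatings of n couples at a round table (A059375):
    # 2 * n! ways to place the women, times A_n ways to place the men.
    return 2 * factorial(n) * menage_number(n)
```

For n = 3, 4, 5 this gives 12, 96, 3120, agreeing with the listed sequence.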

Discussed on

🔗 Secretary Problem

🔗 Mathematics 🔗 Statistics

The secretary problem is a classic problem in optimal stopping theory. It has been studied extensively in applied probability, statistics, and decision theory. It is also known as the marriage problem, the sultan's dowry problem, the fussy suitor problem, the googol game, and the best choice problem.

The basic form of the problem is the following: imagine an administrator who wants to hire the best secretary out of n rankable applicants for a position. The applicants are interviewed one by one in random order. A decision about each particular applicant is to be made immediately after the interview. Once rejected, an applicant cannot be recalled. During the interview, the administrator gains information sufficient to rank the applicant among all applicants interviewed so far, but is unaware of the quality of yet unseen applicants. The question is about the optimal strategy (stopping rule) to maximize the probability of selecting the best applicant. If the decision can be deferred to the end, this can be solved by the simple maximum selection algorithm of tracking the running maximum (and who achieved it), and selecting the overall maximum at the end. The difficulty is that the decision must be made immediately.

The shortest rigorous proof known so far is provided by the odds algorithm (Bruss 2000). It implies that the optimal win probability is always at least 1/e (where e is the base of the natural logarithm), and that this bound holds in much greater generality (2003). The optimal stopping rule prescribes always rejecting the first ~n/e applicants that are interviewed and then stopping at the first applicant who is better than every applicant interviewed so far (or continuing to the last applicant if this never occurs). Sometimes this strategy is called the 1/e stopping rule, because the probability of stopping at the best applicant with this strategy is about 1/e already for moderate values of n. One reason why the secretary problem has received so much attention is that the optimal policy for the problem (the stopping rule) is simple and selects the single best candidate about 37% of the time, irrespective of whether there are 100 or 100 million applicants.
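The 1/e stopping rule is easy to check empirically; a Monte Carlo sketch (parameter names are illustrative):

```python
import math
import random

def simulate_secretary(n, trials=100000, seed=42):
    # Estimate the win probability of the 1/e rule: skip the first
    # ~n/e applicants, then accept the first record-breaker.
    random.seed(seed)
    cutoff = round(n / math.e)
    wins = 0
    for _ in range(trials):
        ranks = list(range(n))          # rank n-1 marks the best applicant
        random.shuffle(ranks)
        best_seen = max(ranks[:cutoff], default=-1)
        chosen = ranks[-1]              # default: forced to take the last one
        for r in ranks[cutoff:]:
            if r > best_seen:
                chosen = r
                break
        wins += chosen == n - 1
    return wins / trials
```

For n = 100 this lands near 0.37, consistent with the 1/e ≈ 0.368 asymptotics described above.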

Discussed on

🔗 Stochastic Resonance

🔗 Mathematics 🔗 Physics

Stochastic resonance (SR) is a phenomenon in which a signal that is normally too weak to be detected by a sensor can be boosted by adding white noise, which contains a wide spectrum of frequencies. The components of the white noise at the original signal's frequencies resonate with it, amplifying the original signal while leaving the rest of the white noise unamplified, thereby increasing the signal-to-noise ratio and making the original signal more prominent. Further, the added white noise can be strong enough to be detectable by the sensor, which can then filter it out to effectively detect the original, previously undetectable signal.

This phenomenon of boosting undetectable signals by resonating with added white noise extends to many other systems, whether electromagnetic, physical or biological, and is an area of research.
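The effect can be seen in a toy model: a sub-threshold sine wave fed through a hard threshold detector. All the parameters below (threshold, amplitude, period) are arbitrary choices for the sketch, not drawn from any particular system.

```python
import math
import random

def detector_correlation(noise_std, n=20000, seed=1):
    # Correlate a threshold detector's output with a sine wave that is
    # too weak to cross the threshold on its own.
    random.seed(seed)
    threshold = 1.0
    acc = 0.0
    for t in range(n):
        s = 0.5 * math.sin(2 * math.pi * t / 100)   # sub-threshold signal
        fired = s + random.gauss(0, noise_std) > threshold
        acc += s if fired else 0.0
    return acc / n
```

With no noise the detector never fires, so the correlation is exactly zero; with moderate noise the threshold crossings cluster around the signal's peaks and the correlation becomes clearly positive.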

Discussed on

🔗 Norton's Dome

🔗 Mathematics 🔗 Physics 🔗 Philosophy

Norton's dome is a thought experiment that exhibits a non-deterministic system within the bounds of Newtonian mechanics. It was devised by John D. Norton in 2003 and is a special limiting case of a more general class of examples from 1997 due to Sanjay Bhat and Dennis Bernstein. The problem can be regarded as one of physics, mathematics, or philosophy.

Discussed on

🔗 Srinivasa Ramanujan

🔗 Biography 🔗 Mathematics 🔗 Biography/science and academia 🔗 History of Science 🔗 India 🔗 India/Indian history workgroup 🔗 India/Tamil Nadu

Srinivasa Ramanujan FRS (22 December 1887 – 26 April 1920) was an Indian mathematician who lived during the British Rule in India. Though he had almost no formal training in pure mathematics, he made substantial contributions to mathematical analysis, number theory, infinite series, and continued fractions, including solutions to mathematical problems then considered unsolvable. Ramanujan initially developed his own mathematical research in isolation: "He tried to interest the leading professional mathematicians in his work, but failed for the most part. What he had to show them was too novel, too unfamiliar, and additionally presented in unusual ways; they could not be bothered". Seeking mathematicians who could better understand his work, in 1913 he began a postal partnership with the English mathematician G. H. Hardy at the University of Cambridge, England. Recognizing Ramanujan's work as extraordinary, Hardy arranged for him to travel to Cambridge. In his notes, Ramanujan had produced groundbreaking new theorems, including some that Hardy said had "defeated him and his colleagues completely", in addition to rediscovering recently proven but highly advanced results.

During his short life, Ramanujan independently compiled nearly 3,900 results (mostly identities and equations). Many were completely novel; his original and highly unconventional results, such as the Ramanujan prime, the Ramanujan theta function, partition formulae and mock theta functions, have opened entire new areas of work and inspired a vast amount of further research. Nearly all his claims have now been proven correct. The Ramanujan Journal, a scientific journal, was established to publish work in all areas of mathematics influenced by Ramanujan, and his notebooks—containing summaries of his published and unpublished results—have been analyzed and studied for decades since his death as a source of new mathematical ideas. As late as 2011 and again in 2012, researchers continued to discover that mere comments in his writings about "simple properties" and "similar outputs" for certain findings were themselves profound and subtle number theory results that remained unsuspected until nearly a century after his death. He became one of the youngest Fellows of the Royal Society and only the second Indian member, and the first Indian to be elected a Fellow of Trinity College, Cambridge. Of his original letters, Hardy stated that a single look was enough to show they could only have been written by a mathematician of the highest calibre, comparing Ramanujan to mathematical geniuses such as Euler and Jacobi.

In 1919, ill health—now believed to have been hepatic amoebiasis (a complication from episodes of dysentery many years previously)—compelled Ramanujan's return to India, where he died in 1920 at the age of 32. His last letters to Hardy, written in January 1920, show that he was still continuing to produce new mathematical ideas and theorems. His "lost notebook", containing discoveries from the last year of his life, caused great excitement among mathematicians when it was rediscovered in 1976.

A deeply religious Hindu, Ramanujan credited his substantial mathematical capacities to divinity, and said the mathematical knowledge he displayed was revealed to him by his family goddess. "An equation for me has no meaning," he once said, "unless it expresses a thought of God."

Discussed on

🔗 Cantor function, a.k.a. devil's staircase: increasing function with 0 derivative

🔗 Mathematics 🔗 Systems 🔗 Systems/Chaos theory

In mathematics, the Cantor function is an example of a function that is continuous, but not absolutely continuous. It is a notorious counterexample in analysis, because it challenges naive intuitions about continuity, derivative, and measure. Though it is continuous everywhere and has zero derivative almost everywhere, its value still goes from 0 to 1 as its argument goes from 0 to 1. Thus, in one sense the function seems very much like a constant one which cannot grow, and in another, it does indeed monotonically grow, by construction.

It is also referred to as the Cantor ternary function, the Lebesgue function, Lebesgue's singular function, the Cantor–Vitali function, the Devil's staircase, the Cantor staircase function, and the Cantor–Lebesgue function. Georg Cantor (1884) introduced the Cantor function and mentioned that Scheeffer pointed out that it was a counterexample to an extension of the fundamental theorem of calculus claimed by Harnack. The Cantor function was discussed and popularized by Scheeffer (1884), Lebesgue (1904) and Vitali (1905).
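The function has a simple constructive description: write x in ternary, truncate after the first digit 1, replace every digit 2 by 1, and read the result as binary. A sketch of that evaluation:

```python
def cantor(x, depth=40):
    # Evaluate the Cantor function on [0, 1] via the ternary expansion of x.
    if x >= 1.0:
        return 1.0
    value, scale = 0.0, 0.5
    for _ in range(depth):
        x *= 3
        digit = int(x)
        x -= digit
        if digit == 1:
            return value + scale   # x fell in a middle third: flat region
        value += (digit // 2) * scale
        scale /= 2
    return value
```

For instance cantor(0.5) is 0.5 (the long flat step over the removed middle third [1/3, 2/3]), while cantor(0.25) is 1/3, since 1/4 has ternary expansion 0.020202….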

🔗 Seven-Dimensional Cross Product

🔗 Mathematics

In mathematics, the seven-dimensional cross product is a bilinear operation on vectors in seven-dimensional Euclidean space. It assigns to any two vectors a, b in R^7 a vector a × b also in R^7. Like the cross product in three dimensions, the seven-dimensional product is anticommutative and a × b is orthogonal both to a and to b. Unlike in three dimensions, it does not satisfy the Jacobi identity, and while the three-dimensional cross product is unique up to a sign, there are many seven-dimensional cross products. The seven-dimensional cross product has the same relationship to the octonions as the three-dimensional product does to the quaternions.

The seven-dimensional cross product is one way of generalising the cross product to other than three dimensions, and it is the only other bilinear product of two vectors that is vector-valued, orthogonal, and has the same magnitude as in the 3D case. In other dimensions there are vector-valued products of three or more vectors that satisfy these conditions, and binary products with bivector results.
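One concrete choice among the many seven-dimensional cross products comes from the Fano-plane rule e_i × e_{i+1} = e_{i+3} (indices mod 7), extended cyclically within each triple and antisymmetrically. The defining properties (orthogonality and the 3D-style magnitude identity) can then be checked numerically; a sketch:

```python
import random

# Multiplication table from the rule e_i x e_{i+1} = e_{i+3}, indices mod 7.
TRIPLES = [(i, (i + 1) % 7, (i + 3) % 7) for i in range(7)]

def cross7(a, b):
    c = [0.0] * 7
    for i, j, k in TRIPLES:
        # cyclic within each triple, antisymmetric under swapping factors
        for p, q, r in ((i, j, k), (j, k, i), (k, i, j)):
            c[r] += a[p] * b[q] - a[q] * b[p]
    return c

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

# a x b is orthogonal to both factors, and
# |a x b|^2 = |a|^2 |b|^2 - (a.b)^2, just as in three dimensions.
random.seed(0)
a = [random.uniform(-1, 1) for _ in range(7)]
b = [random.uniform(-1, 1) for _ in range(7)]
c = cross7(a, b)
```

Relabelling the basis or flipping orientations of the Fano lines yields the other seven-dimensional cross products mentioned above.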

Discussed on

🔗 Seven Bridges of Königsberg

🔗 Mathematics 🔗 Germany 🔗 Germany/Prussia

The Seven Bridges of Königsberg is a historically notable problem in mathematics. Its negative resolution by Leonhard Euler in 1736 laid the foundations of graph theory and prefigured the idea of topology.

The city of Königsberg in Prussia (now Kaliningrad, Russia) was set on both sides of the Pregel River, and included two large islands—Kneiphof and Lomse—which were connected to each other, or to the two mainland portions of the city, by seven bridges. The problem was to devise a walk through the city that would cross each of those bridges once and only once.

By way of specifying the logical task unambiguously, solutions involving either

  1. reaching an island or mainland bank other than via one of the bridges, or
  2. accessing any bridge without crossing to its other end

are explicitly unacceptable.

Euler proved that the problem has no solution. The difficulty he faced was developing a suitable technique of analysis, and subsequent tests, that established this assertion with mathematical rigor.
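Euler's insight reduces the problem to counting land masses touched by an odd number of bridges; a quick check (labelling the two banks A, B and the two islands C, D in the usual textbook abstraction):

```python
from collections import Counter

# The seven bridges: two banks (A, B) and two islands (C, D).
bridges = [("A", "C"), ("A", "C"), ("A", "D"),
           ("B", "C"), ("B", "C"), ("B", "D"),
           ("C", "D")]

degree = Counter()
for u, v in bridges:
    degree[u] += 1
    degree[v] += 1

# Euler: a connected multigraph has a walk using every edge exactly once
# iff at most two vertices have odd degree.
odd = sorted(v for v, d in degree.items() if d % 2 == 1)
print(degree, odd)
```

All four land masses have odd degree (the island Kneiphof has five bridges, the other three regions three each), so no walk crossing every bridge exactly once can exist.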

Discussed on