Random Articles (Page 6)
Take a deep dive into what people are curious about.
🔗 Accelerationism
In political and social theory, accelerationism is the idea that capitalism, or particular processes that historically characterised capitalism, should be accelerated instead of overcome in order to generate radical social change. "Accelerationism" may also refer more broadly, and usually pejoratively, to support for the intensification of capitalism in the belief that this will hasten its self-destructive tendencies and ultimately lead to its collapse.
Some contemporary accelerationist philosophy starts with the Deleuzo–Guattarian theory of deterritorialisation, aiming to identify and radicalise the social forces that promote this emancipatory process.
Accelerationist theory has been divided into mutually contradictory left-wing and right-wing variants. "Left-accelerationism" attempts to press "the process of technological evolution" beyond the constrictive horizon of capitalism, for example by repurposing modern technology for socially beneficial and emancipatory ends; "right-accelerationism" supports the indefinite intensification of capitalism itself, possibly in order to bring about a technological singularity. Accelerationist writers have additionally distinguished other variants, such as "unconditional accelerationism".
Discussed on
- "Accelerationism" | 2019-09-26 | 97 Upvotes 129 Comments
🔗 Functional Fixedness
Functional fixedness is a cognitive bias that limits a person to using an object only in the way it is traditionally used. The concept of functional fixedness originated in Gestalt psychology, a movement in psychology that emphasizes holistic processing. Karl Duncker defined functional fixedness as a "mental block against using an object in a new way that is required to solve a problem". This "block" limits the ability of an individual to use the components given to them to complete a task, as they cannot move past the original purpose of those components. For example, if someone needs a paperweight but only has a hammer, they may not see how the hammer can be used as a paperweight. Functional fixedness is this inability to see a hammer's use as anything other than pounding nails; the person cannot think of using the hammer in any way other than its conventional function.
When tested, 5-year-old children show no signs of functional fixedness. It has been argued that this is because at age 5, any goal to be achieved with an object is equivalent to any other goal. However, by age 7, children have acquired the tendency to treat the originally intended purpose of an object as special.
Discussed on
- "Functional Fixedness" | 2015-08-02 | 34 Upvotes 26 Comments
🔗 Pólya Urn Model
In statistics, a Pólya urn model (also known as a Pólya urn scheme or simply as Pólya's urn), named after George Pólya, is a type of statistical model used as an idealized mental exercise framework, unifying many treatments.
In an urn model, objects of real interest (such as atoms, people, cars, etc.) are represented as colored balls in an urn or other container. In the basic Pólya urn model, the urn contains x white and y black balls; one ball is drawn randomly from the urn and its color observed; it is then returned to the urn, an additional ball of the same color is added to the urn, and the selection process is repeated. Questions of interest are the evolution of the urn population and the sequence of colors of the balls drawn out.
This endows the urn with a self-reinforcing property sometimes expressed as the rich get richer.
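As a concrete illustration, here is a minimal simulation of the basic scheme described above. The function and variable names are my own choices for this sketch, not part of any standard library.

```python
# Minimal sketch of the basic Pólya urn: draw a ball uniformly at random,
# return it, and add one more ball of the same colour.
import random

def polya_urn(white, black, draws, seed=None):
    rng = random.Random(seed)
    urn = {"white": white, "black": black}
    history = []
    for _ in range(draws):
        total = urn["white"] + urn["black"]
        colour = "white" if rng.random() < urn["white"] / total else "black"
        urn[colour] += 1  # ball returned plus one duplicate of the same colour
        history.append(colour)
    return history

if __name__ == "__main__":
    # Start with 1 white and 1 black ball; long runs tend to lock in an early
    # lead -- the "rich get richer" effect mentioned above.
    draws = polya_urn(1, 1, 10_000, seed=42)
    print(draws.count("white") / len(draws))
```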
Note that in some sense, the Pólya urn model is the "opposite" of the model of sampling without replacement, where every time a particular value is observed, it is less likely to be observed again, whereas in a Pólya urn model, an observed value is more likely to be observed again. In both of these models, the act of measurement has an effect on the outcome of future measurements. (For comparison, when sampling with replacement, observation of a particular value has no effect on how likely it is to observe that value again.) In a Pólya urn model, successive acts of measurement over time have less and less effect on future measurements, whereas in sampling without replacement, the opposite is true: After a certain number of measurements of a particular value, that value will never be seen again.
One of the reasons for interest in this particular rather elaborate urn model (i.e. with duplication and then replacement of each ball drawn) is that it provides an example in which the count (initially x black and y white) of balls in the urn is not concealed, which is able to approximate the correct updating of subjective probabilities appropriate to a different case in which the original urn content is concealed while ordinary sampling with replacement is conducted (without the Pólya ball-duplication). Because of the simple "sampling with replacement" scheme in this second case, the urn content is now static, but this greater simplicity is compensated for by the assumption that the urn content is now unknown to an observer. A Bayesian analysis of the observer's uncertainty about the urn's initial content can be made, using a particular choice of (conjugate) prior distribution. Specifically, suppose that an observer knows that the urn contains only identical balls, each coloured either black or white, but he does not know the absolute number of balls present, nor the proportion that are of each colour. Suppose that he holds prior beliefs about these unknowns: for him the probability distribution of the urn content is well approximated by some prior distribution for the total number of balls in the urn, and a beta prior distribution with parameters (x,y) for the initial proportion of these which are black, this proportion being (for him) considered approximately independent of the total number. Then the process of outcomes of a succession of draws from the urn (with replacement but without the duplication) has approximately the same probability law as does the above Pólya scheme in which the actual urn content was not hidden from him. The approximation error here relates to the fact that an urn containing a known finite number m of balls of course cannot have an exactly beta-distributed unknown proportion of black balls, since the domain of possible values for that proportion is confined to multiples of 1/m, rather than having the full freedom to assume any value in the continuous unit interval, as would an exactly beta-distributed proportion. This slightly informal account is provided for reason of motivation, and can be made more mathematically precise.
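For the idealized case in which the hidden proportion may take any value in the unit interval, the agreement is exact and can be verified directly. The sketch below uses its own notation (an urn starting with a balls of one colour and b of the other, and a Beta(a, b) prior on the proportion of the first colour) rather than the exact parameterisation of the paragraph above.

```latex
% Probability of a fixed draw sequence containing k balls of the first colour and
% j of the second (n = k + j) under the Pólya scheme (start: a of the first colour,
% b of the second; each drawn ball returned together with one duplicate):
\[
  P(\text{sequence}) \;=\;
  \frac{a(a+1)\cdots(a+k-1)\;\cdot\; b(b+1)\cdots(b+j-1)}
       {(a+b)(a+b+1)\cdots(a+b+n-1)} .
\]
% Under ordinary sampling with replacement from a hidden urn whose first-colour
% proportion p has a Beta(a, b) prior:
\[
  P(\text{sequence}) \;=\; \int_0^1 p^{k}(1-p)^{j}\,
  \frac{p^{a-1}(1-p)^{b-1}}{B(a,b)}\,dp
  \;=\; \frac{B(a+k,\; b+j)}{B(a,b)} ,
\]
% which expands, via B(u,v) = \Gamma(u)\Gamma(v)/\Gamma(u+v), to the same ratio of
% rising factorials, so the two probability laws coincide in this idealized case.
```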
This basic P贸lya urn model has been enriched and generalized in many ways.
Discussed on
- "P贸lya Urn Model" | 2022-03-18 | 59 Upvotes 3 Comments
🔗 Laws of Power: Machiavelli and Sun-Tzu brought up-to-date
The 48 Laws of Power (1998) is a non-fiction book by American author Robert Greene. The book is a bestseller, selling over 1.2 million copies in the United States, and is popular with prison inmates and celebrities.
Discussed on
- "Laws of Power: Machiavelli and Sun-Tzu brought up-to-date" | 2009-10-27 | 27 Upvotes 19 Comments
🔗 Type 3 Diabetes (Alzheimer's)
Type 3 diabetes is a proposed term for the interlinked association between type 1 and type 2 diabetes and Alzheimer's disease. The term is used when investigating potential triggers of Alzheimer's disease in people with diabetes.
The proposed progression from diabetes to Alzheimer's disease is inadequately understood; however, there are a number of hypotheses describing potential links between the two diseases. Insulin resistance and other metabolic risk factors such as hyperglycaemia, driven by oxidative stress and lipid peroxidation, are processes commonly thought to contribute to the development of Alzheimer's disease in diabetics.
Diagnosis differs between patients with type 1 and type 2 diabetes: type 1 diabetes is usually discovered in childhood or adolescence, while type 2 diabetes is often diagnosed later in life. While type 3 diabetes is not a diagnosis in itself, a diagnosis of suspected Alzheimer's disease can be established through observational signs and sometimes with neuroimaging techniques such as magnetic resonance imaging (MRI), which can reveal abnormalities in a diabetic patient's brain tissue.
The techniques used to prevent the disease in patients with diabetes are similar to those used for individuals who show no signs of the disease. The four pillars of Alzheimer's disease prevention are currently used as a guide for individuals who are at risk of developing Alzheimer's disease.
Research is currently being conducted into whether administration of glucagon-like peptide 1 and melatonin can slow the progression of Alzheimer's disease in diabetic patients.
Labelling Alzheimer's disease as type 3 diabetes remains controversial, and the term is not a recognised medical diagnosis. While insulin resistance is a risk factor for the development of Alzheimer's disease and some other dementias, the causes of Alzheimer's disease are likely to be far more complex than insulin factors alone can explain, and some patients with Alzheimer's disease have normal insulin metabolism.
🔗 Top 500 supercomputers by processor family
A supercomputer is a computer with a high level of performance as compared to a general-purpose computer. The performance of a supercomputer is commonly measured in floating-point operations per second (FLOPS) instead of million instructions per second (MIPS). Since 2017, there have been supercomputers that can perform over a hundred quadrillion FLOPS (100 petaFLOPS). Since November 2017, all of the world's fastest 500 supercomputers run Linux-based operating systems. Additional research is being conducted in China, the United States, the European Union, Taiwan and Japan to build faster, more powerful and technologically superior exascale supercomputers.
Supercomputers play an important role in the field of computational science, and are used for a wide range of computationally intensive tasks in various fields, including quantum mechanics, weather forecasting, climate research, oil and gas exploration, molecular modeling (computing the structures and properties of chemical compounds, biological macromolecules, polymers, and crystals), and physical simulations (such as simulations of the early moments of the universe, airplane and spacecraft aerodynamics, the detonation of nuclear weapons, and nuclear fusion). They have been essential in the field of cryptanalysis.
Supercomputers were introduced in the 1960s, and for several decades the fastest were made by Seymour Cray at Control Data Corporation (CDC), Cray Research and subsequent companies bearing his name or monogram. The first such machines were highly tuned conventional designs that ran faster than their more general-purpose contemporaries. Through the decade, increasing amounts of parallelism were added, with one to four processors being typical. From the 1970s, vector processors operating on large arrays of data came to dominate. A notable example is the highly successful Cray-1 of 1976. Vector computers remained the dominant design into the 1990s. From then until today, massively parallel supercomputers with tens of thousands of off-the-shelf processors became the norm.
The US has long been the leader in the supercomputer field, first through Cray's almost uninterrupted dominance of the field, and later through a variety of technology companies. Japan made major strides in the field in the 1980s and 90s, with China becoming increasingly active in the field. As of November 2018, the fastest supercomputer on the TOP500 supercomputer list is Summit, in the United States, with a LINPACK benchmark score of 143.5 PFLOPS, followed by Sierra, which trails by around 48.860 PFLOPS. The US has five of the top 10 and China has two. In June 2018, all supercomputers on the list combined broke the 1 exaFLOPS mark.
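As a rough illustration of how LINPACK-style scores are obtained, the sketch below times a dense linear solve and converts it into a FLOPS estimate. It is a toy approximation under my own assumptions, not the tuned HPL benchmark actually used for TOP500 rankings, and the function name `estimate_gflops` is invented for this example.

```python
# Toy LINPACK-style estimate: solving a dense n x n system by LU factorisation
# costs roughly (2/3)n^3 + 2n^2 floating-point operations.
import time
import numpy as np

def estimate_gflops(n=2000):
    a = np.random.rand(n, n)
    b = np.random.rand(n)
    start = time.perf_counter()
    np.linalg.solve(a, b)            # dense solve, as in the LINPACK benchmark
    elapsed = time.perf_counter() - start
    flops = (2 / 3) * n**3 + 2 * n**2
    return flops / elapsed / 1e9     # billions of floating-point ops per second

if __name__ == "__main__":
    print(f"~{estimate_gflops():.1f} GFLOPS on this machine")
```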
Discussed on
- "Top 500 supercomputers by processor family" | 2019-09-24 | 57 Upvotes 29 Comments
🔗 65537-gon
In geometry, a 65537-gon is a polygon with 65,537 (2^16 + 1) sides. The sum of the interior angles of any non-self-intersecting 65537-gon is 11,796,300°.
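For reference, that figure follows from the usual interior-angle formula for a simple polygon, worked out below.

```latex
% Interior-angle sum of a simple n-gon, applied to n = 65537:
\[
  \sum_i \theta_i \;=\; (n - 2)\cdot 180^\circ
                  \;=\; (65537 - 2)\cdot 180^\circ
                  \;=\; 65535 \cdot 180^\circ
                  \;=\; 11\,796\,300^\circ .
\]
```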
Discussed on
- "65537-gon" | 2025-04-15 | 14 Upvotes 3 Comments
🔗 First-Mover Advantage
In marketing strategy, first-mover advantage (FMA) is the advantage gained by the initial ("first-moving") significant occupant of a market segment. First-mover advantage may be gained by technological leadership, or early purchase of resources.
A market participant has first-mover advantage if it is the first entrant and gains a competitive advantage through control of resources. With this advantage, first-movers can be rewarded with huge profit margins and a monopoly-like status.
Not all first-movers are rewarded. If the first-mover does not capitalize on its advantage, its "first-mover disadvantages" leave opportunity for new entrants to enter the market and compete more effectively and efficiently than the first-movers; such firms have "second-mover advantage".
Discussed on
- "First-Mover Advantage" | 2020-03-22 | 21 Upvotes 12 Comments
🔗 Room 641A
Room 641A is a telecommunication interception facility operated by AT&T for the U.S. National Security Agency, as part of its warrantless surveillance program as authorized by the Patriot Act. The facility commenced operations in 2003 and its purpose was publicly revealed in 2006.
Discussed on
- "Room 641A" | 2024-09-11 | 51 Upvotes 5 Comments
- "Room 641A" | 2022-09-26 | 29 Upvotes 2 Comments
- "Room 641A" | 2020-05-29 | 333 Upvotes 70 Comments
- "Room 641A" | 2016-09-16 | 207 Upvotes 75 Comments
- "Room 641A" | 2013-06-09 | 248 Upvotes 44 Comments
🔗 Principle of Least Astonishment
The principle of least astonishment (POLA), also called the principle of least surprise (alternatively a "law" or "rule"), applies to user interface and software design. A typical formulation of the principle, dating from 1984, is: "If a necessary feature has a high astonishment factor, it may be necessary to redesign the feature."
More generally, the principle means that a component of a system should behave in a way that most users will expect it to behave; the behavior should not astonish or surprise users.
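As a small, hypothetical illustration in Python (the function names are invented for this example), compare a "getter" that surprises callers with functions whose behaviour matches their names.

```python
# Astonishing: a function named like a read-only getter that silently mutates
# the caller's data as a side effect.
def get_items_surprising(queue):
    items = list(queue)
    queue.clear()          # surprise: "getting" also empties the queue
    return items

# Less astonishing: reading does not modify the caller's data, and the
# destructive operation lives in a separate, clearly named function.
def get_items(queue):
    return list(queue)

def drain_items(queue):
    items = list(queue)
    queue.clear()          # the name says the queue will be emptied
    return items
```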
Discussed on
- "Principle of Least Astonishment" | 2020-05-05 | 22 Upvotes 11 Comments