Random Articles (Page 6)

Take a deep dive into what people are curious about.

🔗 Royal Mail Rubber Band

🔗 Philately

A Royal Mail rubber band is a small red elastic loop used by the postal delivery service in the United Kingdom. In the course of its work, the Royal Mail consumes nearly one billion rubber bands per year to tie together bundles of letters at sorting offices. In the 2000s, complaints about Royal Mail rubber bands littering the streets of Britain gave rise to ongoing press interest in this minor cultural phenomenon. The Royal Mail no longer uses red rubber bands.

🔗 Kernel Embedding of Distributions

🔗 Computer science

In machine learning, the kernel embedding of distributions (also called the kernel mean or mean map) comprises a class of nonparametric methods in which a probability distribution is represented as an element of a reproducing kernel Hilbert space (RKHS). A generalization of the individual data-point feature mapping done in classical kernel methods, the embedding of distributions into infinite-dimensional feature spaces can preserve all of the statistical features of arbitrary distributions, while allowing one to compare and manipulate distributions using Hilbert space operations such as inner products, distances, projections, linear transformations, and spectral analysis. This learning framework is very general and can be applied to distributions over any space Ω on which a sensible kernel function (measuring similarity between elements of Ω) may be defined. For example, various kernels have been proposed for learning from data which are: vectors in ℝ^d, discrete classes/categories, strings, graphs/networks, images, time series, manifolds, dynamical systems, and other structured objects. The theory behind kernel embeddings of distributions has been primarily developed by Alex Smola, Le Song, Arthur Gretton, and Bernhard Schölkopf. Reviews of recent work on kernel embeddings of distributions can be found in the literature.

The analysis of distributions is fundamental in machine learning and statistics, and many algorithms in these fields rely on information theoretic approaches such as entropy, mutual information, or Kullback–Leibler divergence. However, to estimate these quantities, one must first either perform density estimation, or employ sophisticated space-partitioning/bias-correction strategies which are typically infeasible for high-dimensional data. Commonly, methods for modeling complex distributions rely on parametric assumptions that may be unfounded or computationally challenging (e.g. Gaussian mixture models), while nonparametric methods like kernel density estimation (Note: the smoothing kernels in this context have a different interpretation than the kernels discussed here) or characteristic function representation (via the Fourier transform of the distribution) break down in high-dimensional settings.

Methods based on the kernel embedding of distributions sidestep these problems and also possess the following advantages:

  1. Data may be modeled without restrictive assumptions about the form of the distributions or the relationships between variables.
  2. Intermediate density estimation is not needed.
  3. Practitioners may specify the properties of a distribution most relevant to their problem (incorporating prior knowledge via the choice of kernel).
  4. If a characteristic kernel is used, the embedding uniquely preserves all information about a distribution, while, thanks to the kernel trick, computations on the potentially infinite-dimensional RKHS can be implemented in practice as simple Gram-matrix operations.
  5. Dimensionality-independent rates of convergence of the empirical kernel mean (estimated from samples of the distribution) to the kernel embedding of the true underlying distribution can be proven.
  6. Learning algorithms based on this framework exhibit good generalization ability and finite sample convergence, while often being simpler and more effective than information theoretic methods.

Thus, learning via the kernel embedding of distributions offers a principled drop-in replacement for information theoretic approaches and is a framework which not only subsumes many popular methods in machine learning and statistics as special cases, but also can lead to entirely new learning algorithms.
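The Gram-matrix idea can be made concrete with a small sketch. The maximum mean discrepancy (MMD) is the RKHS distance between the kernel mean embeddings of two distributions; its biased empirical estimate needs only kernel evaluations between sample points. This is an illustrative implementation, not code from any particular library; the Gaussian kernel and its bandwidth are assumptions:

```python
import math
import random

def gaussian_kernel(x, y, sigma=1.0):
    """Gaussian (RBF) kernel between two vectors, an example of a characteristic kernel."""
    d2 = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-d2 / (2 * sigma ** 2))

def mmd2(X, Y, kernel=gaussian_kernel):
    """Biased empirical estimate of the squared MMD between samples X and Y.

    This is the squared RKHS distance between the two empirical kernel mean
    embeddings, computed entirely from Gram-matrix entries (the kernel trick):
    mean(K_XX) + mean(K_YY) - 2 * mean(K_XY).
    """
    kxx = sum(kernel(a, b) for a in X for b in X) / (len(X) ** 2)
    kyy = sum(kernel(a, b) for a in Y for b in Y) / (len(Y) ** 2)
    kxy = sum(kernel(a, b) for a in X for b in Y) / (len(X) * len(Y))
    return kxx + kyy - 2 * kxy

# Usage: samples from the same distribution give a small MMD, while a
# shifted sample gives a clearly larger one.
rng = random.Random(0)
X = [[rng.gauss(0, 1)] for _ in range(100)]
Y = [[rng.gauss(0, 1)] for _ in range(100)]
Z = [[rng.gauss(3, 1)] for _ in range(100)]
print(mmd2(X, Y), mmd2(X, Z))
```

Note that no density is ever estimated: the two distributions are compared purely through pairwise kernel evaluations, which is exactly the advantage listed in point 4 above.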

🔗 Project Habakkuk, Britain's plan to build an aircraft carrier from ice

🔗 Technology 🔗 Military history 🔗 Military history/Military aviation 🔗 Military history/North American military history 🔗 Military history/Military science, technology, and theory 🔗 Military history/Weaponry 🔗 Canada 🔗 Architecture 🔗 United Kingdom 🔗 Military history/Maritime warfare 🔗 Military history/World War II 🔗 Engineering 🔗 Ships 🔗 Military history/Canadian military history 🔗 Military history/European military history 🔗 Military history/British military history

Project Habakkuk or Habbakuk (spelling varies) was a plan by the British during the Second World War to construct an aircraft carrier out of pykrete (a mixture of wood pulp and ice) for use against German U-boats in the mid-Atlantic, which were beyond the flight range of land-based planes at the time. The idea came from Geoffrey Pyke, who worked for Combined Operations Headquarters. After promising scale tests and the construction of a prototype on Patricia Lake in Jasper National Park, Alberta, Canada, the project was shelved due to rising costs, added requirements, and the availability of longer-range aircraft and escort carriers, which closed the mid-Atlantic gap the project was intended to address.

🔗 Visualizing Pi (π)

An animated image showing the definition of pi. A number line is marked off by a circle of unit diameter. Starting from zero, the circle "unrolls" its circumference. At one full turn, the unrolled circumference reaches the point we call π. The number π is irrational and, being transcendental, cannot be constructed with compass and straightedge.

Animation credit: John Reid
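The unrolling shown in the animation can be mimicked numerically. As a sketch (using Archimedes' side-doubling recurrence, which is an assumption of this example rather than anything depicted in the image), the perimeter of a regular polygon inscribed in a circle of diameter 1 approaches the unrolled circumference, i.e. π:

```python
import math

def unrolled_pi(doublings=15):
    """Approximate pi as the perimeter of an inscribed regular polygon
    in a circle of unit diameter, starting from a hexagon and doubling
    the number of sides each step."""
    r = 0.5           # radius of a unit-diameter circle
    n, s = 6, r       # inscribed hexagon: side length equals the radius
    for _ in range(doublings):
        x = s / (2 * r)
        # Numerically stable half-angle form of the side-doubling rule,
        # avoiding the cancellation in s' = sqrt(2r^2 - 2r*sqrt(r^2 - (s/2)^2)).
        s = s / math.sqrt(2 * (1 + math.sqrt(1 - x * x)))
        n *= 2
    return n * s      # polygon perimeter, approaching the circumference

print(unrolled_pi())  # approaches 3.14159...
```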

🔗 Teuvo Kohonen Has Died

🔗 Biography

Teuvo Kalevi Kohonen (11 July 1934 – 15 December 2021) was a prominent Finnish academic (Dr. Eng.) and researcher. He was professor emeritus of the Academy of Finland.

Prof. Kohonen made many contributions to the field of artificial neural networks, including the Learning Vector Quantization algorithm, fundamental theories of distributed associative memory and optimal associative mappings, the learning subspace method, and novel algorithms for symbol processing such as redundant hash addressing. He published several books and over 300 peer-reviewed papers.

Kohonen's most famous contribution is the Self-Organizing Map (also known as the Kohonen map or Kohonen network, although Kohonen himself preferred the name SOM). Owing to the popularity of the SOM algorithm in many research fields and practical applications, Kohonen is often considered the most cited Finnish scientist. The current version of the SOM bibliography contains close to 8,000 entries.
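The core SOM update rule is simple: each input is matched to its best-matching unit (BMU), and that unit and its grid neighbours are pulled toward the input, so that nearby units come to represent nearby inputs. This is a generic textbook sketch, not Kohonen's original code; all parameter names and schedules are illustrative choices:

```python
import math
import random

def train_som(data, n_units=10, epochs=50, lr0=0.5, sigma0=3.0, seed=0):
    """Train a 1-D self-organizing map on a list of 2-D input vectors."""
    rng = random.Random(seed)
    # Initialize the unit weight vectors randomly.
    weights = [[rng.uniform(0, 1), rng.uniform(0, 1)] for _ in range(n_units)]
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)                   # decaying learning rate
        sigma = max(sigma0 * (1 - t / epochs), 0.5)   # shrinking neighborhood
        for x in data:
            # Best-matching unit: the unit whose weights are closest to x.
            bmu = min(range(n_units),
                      key=lambda i: sum((w - v) ** 2
                                        for w, v in zip(weights[i], x)))
            # Pull every unit toward x, weighted by its grid distance
            # to the BMU (Gaussian neighborhood function).
            for i in range(n_units):
                h = math.exp(-((i - bmu) ** 2) / (2 * sigma ** 2))
                weights[i] = [w + lr * h * (v - w)
                              for w, v in zip(weights[i], x)]
    return weights

# Usage: data along a line; after training, the units spread out to cover it.
data = [[i / 99, i / 99] for i in range(100)]
units = train_som(data)
```

The neighborhood function is what distinguishes a SOM from plain vector quantization: because neighbours move together, the map preserves topology as well as quantizing the data.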

During most of his career, Prof. Kohonen conducted research at Helsinki University of Technology (TKK). The Neural Networks Research Centre of TKK, a centre of excellence appointed by the Academy of Finland, was founded to conduct research related to Teuvo Kohonen's innovations. After Kohonen's retirement, the centre was led by Prof. Erkki Oja and was later renamed the Adaptive Informatics Research Centre, with a broadened research focus.

Teuvo Kohonen served as the First Vice President of the International Association for Pattern Recognition from 1982 to 1984, and as the first president of the European Neural Network Society from 1991 to 1992.

For his scientific achievements, Prof. Kohonen received a number of prizes, including the following:

  • IEEE Neural Networks Council Pioneer Award, 1991
  • Technical Achievement Award of the IEEE Signal Processing Society, 1995
  • IEEE Frank Rosenblatt Award, 2008

🔗 Preparedness Paradox

🔗 Disaster management

The preparedness paradox is the proposition that when a society or individual acts effectively to mitigate a potential disaster, such as a pandemic, natural disaster or other catastrophe, so that it causes less harm, the avoided danger is perceived as having been much less serious because of the limited damage actually caused. The paradox is the mistaken perception that careful preparation was unnecessary because little harm occurred, when in reality the harm was limited precisely because of that preparation. Several cognitive biases can consequently hamper proper preparation for future risks.

🔗 Gilbert U-238 Atomic Energy Laboratory

🔗 Physics 🔗 Education 🔗 Toys

The Gilbert U-238 Atomic Energy Lab was a toy lab set designed to allow children to create and watch nuclear and chemical reactions using radioactive material. The Atomic Energy Lab was released by the A. C. Gilbert Company in 1950.

🔗 Model M keyboard

🔗 Computing 🔗 Computing/Computer hardware

Model M designates a group of computer keyboards designed and manufactured by IBM starting in 1984, and later by Lexmark International, Maxi Switch, and Unicomp. The keyboard's many variations have their own distinct characteristics, with the vast majority having a buckling-spring key design and swappable keycaps. Model M keyboards have been praised by computer enthusiasts and frequent typists due to their durability and consistency, and the tactile and auditory feedback they provide.

The Model M is also regarded as a timeless and durable piece of hardware. Although the computers and peripherals produced alongside the Model M are now obsolete, many Model M keyboards remain in use, owing to their physical durability and the continued validity of their ANSI 101-key and ISO 102-key layouts; they can be connected to modern machines through a PS/2 female to USB male adapter with a built-in level converter. Since their original popularity, new generations of writers and computer technicians have rediscovered their unique functionality and aesthetics. The Kentucky-based company Unicomp continues to manufacture and sell Model M keyboards.

🔗 Internet 0

🔗 Computing 🔗 Computing/Networking

Internet 0 is a low-speed physical layer designed to route 'IP over anything.' It was developed at MIT's Center for Bits and Atoms by Neil Gershenfeld, Raffi Krikorian, and Danny Cohen. At the time it was invented, a number of other proposals were being labelled "Internet 2." The name was chosen to emphasize that this was designed to be a slow but very inexpensive internetworking system, and to forestall "high-performance" comparison questions such as "how fast is it?"

Effectively, it would provide a platform for pervasive computing: everything in a building could be on the same network, sharing data gathering and actuation. A light switch could turn on a light bulb by sending it a packet, and the user could link the two devices directly.
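The light-switch example can be sketched with ordinary sockets. This is purely illustrative: Internet 0 specifies a physical layer, not this code, and the loopback address and one-word "TOGGLE" message are invented stand-ins for a bulb on an Internet 0 network:

```python
import socket

# A stand-in "bulb": any IP-addressable endpoint listening for datagrams.
bulb = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
bulb.bind(("127.0.0.1", 0))          # let the OS pick a free port
bulb.settimeout(5)
bulb_addr = bulb.getsockname()

# The "switch" does nothing but address a small packet to the bulb;
# no central controller is involved, the devices talk IP to each other.
switch = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
switch.sendto(b"TOGGLE", bulb_addr)  # hypothetical one-word protocol

msg, _ = bulb.recvfrom(64)
light_on = (msg == b"TOGGLE")        # the bulb acts on the packet it received
switch.close()
bulb.close()
```

The point of the sketch is the architecture, not the transport: because every device speaks IP, the switch-to-bulb link needs no gateway or translation, which is exactly what "IP over anything" was meant to enable.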
