Topic: Systems (Page 5)

You are looking at all articles with the topic "Systems". We found 53 matches.

Hint: To view all topics, click here. To see the most popular topics, click here instead.

🔗 History of software engineering

🔗 Computing 🔗 Systems 🔗 Computing/Software 🔗 Systems/Software engineering

From its beginnings in the 1960s, writing software has evolved into a profession concerned with how best to maximize the quality of software and how to create it. Quality can refer to how maintainable software is, to its stability, speed, usability, testability, readability, size, cost, security, and number of flaws or "bugs", as well as to less measurable qualities like elegance, conciseness, and customer satisfaction, among many other attributes. How best to create high-quality software is a separate and controversial problem, covering software design principles, so-called "best practices" for writing code, and broader management issues such as optimal team size, process, how best to deliver software on time and as quickly as possible, workplace "culture", hiring practices, and so forth. All of this falls under the broad rubric of software engineering.

🔗 Work expands so as to fill the time available for its completion

🔗 Economics 🔗 Systems 🔗 Business 🔗 Sociology 🔗 Organizations 🔗 Engineering 🔗 Systems/Project management

Parkinson's law is the adage that "work expands so as to fill the time available for its completion". It is sometimes applied to the growth of bureaucracy in an organization.

🔗 Emergence

🔗 Biology 🔗 Physics 🔗 Economics 🔗 Philosophy 🔗 Systems 🔗 Philosophy/Philosophy of science 🔗 Philosophy/Epistemology

In philosophy, systems theory, science, and art, emergence occurs when an entity is observed to have properties its parts do not have on their own. These properties or behaviors emerge only when the parts interact in a wider whole. For example, smooth forward motion emerges when a bicycle and its rider interoperate, but neither part can produce the behavior on its own.

Emergence plays a central role in theories of integrative levels and of complex systems. For instance, the phenomenon of life as studied in biology is an emergent property of chemistry, and psychological phenomena emerge from the neurobiological phenomena of living things.

In philosophy, theories that emphasize emergent properties have been called emergentism. Almost all accounts of emergentism include a form of epistemic or ontological irreducibility to the lower levels.
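
The bicycle example above is qualitative; the same point can be made computationally. As an illustrative sketch (not drawn from the article), the following few lines implement Conway's Game of Life, a standard textbook example of emergence: each cell obeys only local rules about its immediate neighbors, yet a "glider" emerges that travels across the grid as a coherent whole.

```python
# A minimal Game of Life step function. Cells are (x, y) pairs; `live` is
# the set of live cells. Illustrative sketch only.
from collections import Counter

def step(live):
    """One generation: a live cell survives with 2 or 3 live neighbors;
    a dead cell with exactly 3 live neighbors is born."""
    counts = Counter((x + dx, y + dy)
                     for (x, y) in live
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# The classic glider: after 4 steps the same shape reappears, shifted
# diagonally by (1, 1) -- motion that no single cell's rule mentions.
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
cells = glider
for _ in range(4):
    cells = step(cells)
print(sorted(cells) == sorted((x + 1, y + 1) for (x, y) in glider))  # True
```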

🔗 Weierstrass Function

🔗 Mathematics 🔗 Systems 🔗 Systems/Chaos theory

In mathematics, the Weierstrass function is an example of a real-valued function that is continuous everywhere but differentiable nowhere. It is an example of a fractal curve. It is named after its discoverer Karl Weierstrass.
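
The excerpt describes the function only in words; in symbols, Weierstrass's original family of examples is the series

```latex
W(x) = \sum_{n=0}^{\infty} a^{n} \cos\!\left(b^{n} \pi x\right),
\qquad 0 < a < 1, \qquad ab > 1 + \tfrac{3\pi}{2},
```

with b a positive odd integer (Weierstrass's original conditions; Hardy later weakened them to ab ≥ 1). Each partial sum is a smooth trigonometric polynomial, and because the geometric series Σ aⁿ converges, the series converges uniformly, making W continuous; but the frequencies bⁿ grow so fast that the oscillations never smooth out at any scale, which is what defeats differentiability at every point.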

The Weierstrass function has historically served the role of a pathological function, being the first published example (1872) specifically concocted to challenge the notion that every continuous function is differentiable except on a set of isolated points. Weierstrass's demonstration that continuity did not imply almost-everywhere differentiability upended mathematics, overturning several proofs that relied on geometric intuition and vague definitions of smoothness. These types of functions were denounced by contemporaries: Henri Poincaré famously described them as "monsters" and called Weierstrass's work "an outrage against common sense", while Charles Hermite wrote that they were a "lamentable scourge". The functions were difficult to visualize until the arrival of computers in the next century, and the results did not gain wide acceptance until practical applications such as models of Brownian motion necessitated infinitely jagged functions (nowadays known as fractal curves).

🔗 The Limits to Growth (1972)

🔗 Climate change 🔗 Environment 🔗 Books 🔗 Systems 🔗 Futures studies 🔗 Energy

The Limits to Growth (often abbreviated LTG) is a 1972 report that discussed the possibility of exponential economic and population growth with a finite supply of resources, studied by computer simulation. The study used the World3 computer model to simulate the consequences of interactions between the Earth and human systems. The model was based on the work of Jay Forrester of MIT, as described in his book World Dynamics.
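
World3 itself is a large system-dynamics model with dozens of coupled stocks and feedback loops. As an illustration of the general technique only (a toy sketch: the stocks, rates, and constants below are invented here and are not the World3 equations), a resource-constrained growth simulation can be written as a few coupled difference equations:

```python
# A toy stock-and-flow simulation in the system-dynamics style.
# This is NOT the World3 model; all quantities are invented for illustration.

def simulate(years=200, dt=1.0):
    population = 1.0      # arbitrary units
    resources = 100.0     # finite, non-renewable stock
    history = []
    for t in range(int(years / dt)):
        # Growth slows, and mortality rises, as the resource stock depletes.
        resource_fraction = resources / 100.0
        births = 0.03 * population * resource_fraction
        deaths = 0.01 * population * (2.0 - resource_fraction)
        consumption = 0.05 * population
        population += (births - deaths) * dt
        resources = max(resources - consumption * dt, 0.0)
        history.append((t * dt, population, resources))
    return history

# Prints the qualitative overshoot-and-decline trajectory the report describes.
for year, pop, res in simulate()[::40]:
    print(f"year {year:5.0f}  population {pop:7.2f}  resources {res:7.2f}")
```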

Commissioned by the Club of Rome, the study saw its findings first presented at international gatherings in Moscow and Rio de Janeiro in the summer of 1971. The report's authors are Donella H. Meadows, Dennis L. Meadows, Jørgen Randers, and William W. Behrens III, representing a team of 17 researchers.

The report's findings suggest that, in the absence of significant alterations in resource utilization, an abrupt and unmanageable decline in both population and industrial capacity is highly likely. Although the report faced severe criticism and scrutiny upon its release, subsequent research has consistently found that global use of natural resources has not changed enough since then to alter its basic predictions.

Since its publication, some 30 million copies of the book in 30 languages have been purchased. It continues to generate debate and has been the subject of several subsequent publications.

Beyond the Limits and The Limits to Growth: The 30-Year Update were published in 1992 and 2004 respectively; in 2012, a 40-year forecast from Jørgen Randers, one of the book's original authors, was published as 2052: A Global Forecast for the Next Forty Years; and in 2022 two of the original Limits to Growth authors, Dennis Meadows and Jørgen Randers, joined 19 other contributors to produce Limits and Beyond.

🔗 Expert System

🔗 Computer science 🔗 Systems 🔗 Human–Computer Interaction

In artificial intelligence, an expert system is a computer system emulating the decision-making ability of a human expert. Expert systems are designed to solve complex problems by reasoning through bodies of knowledge, represented mainly as if–then rules rather than through conventional procedural code. The first expert systems were created in the 1970s and then proliferated in the 1980s. Expert systems were among the first truly successful forms of artificial intelligence (AI) software. An expert system is divided into two subsystems: the inference engine and the knowledge base. The knowledge base represents facts and rules. The inference engine applies the rules to the known facts to deduce new facts. Inference engines can also include explanation and debugging abilities.
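
A minimal sketch of the two-subsystem architecture described above (the facts and rules here are invented for this example; real 1980s expert-system shells were far richer): the knowledge base holds facts and if–then rules, and the inference engine fires rules against known facts until nothing new can be deduced.

```python
# Minimal forward-chaining inference engine: a knowledge base of facts and
# if-then rules, plus an engine that fires rules until a fixed point.
# Illustrative sketch only; facts and rules are invented.

facts = {"has_fever", "has_rash"}

# Each rule: (frozenset of premises, conclusion).
rules = [
    (frozenset({"has_fever", "has_rash"}), "suspect_measles"),
    (frozenset({"suspect_measles"}), "recommend_specialist"),
]

def forward_chain(facts, rules):
    facts = set(facts)
    changed = True
    while changed:                       # repeat until no rule adds a fact
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)    # deduce a new fact
                changed = True
    return facts

print(forward_chain(facts, rules))
# {'has_fever', 'has_rash', 'suspect_measles', 'recommend_specialist'}
```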

🔗 Clifford A. Pickover

🔗 Biography 🔗 Systems 🔗 Biography/science and academia 🔗 Journalism 🔗 Systems/Visualization

Clifford Alan Pickover (born August 15, 1957) is an American author, editor, and columnist in the fields of science, mathematics, science fiction, innovation, and creativity. For many years, he was employed at the IBM Thomas J. Watson Research Center in Yorktown Heights, New York, where he was Editor-in-Chief of the IBM Journal of Research and Development. He has been granted more than 500 U.S. patents, is an elected Fellow of the Committee for Skeptical Inquiry, and is the author of more than 50 books, translated into more than a dozen languages.

🔗 Computer

🔗 Technology 🔗 Video games 🔗 Computing 🔗 Computer science 🔗 Computing/Computer hardware 🔗 Systems 🔗 Computing/Software 🔗 Engineering 🔗 Home Living

A computer is a machine that can be instructed to carry out sequences of arithmetic or logical operations automatically via computer programming. Modern computers have the ability to follow generalized sets of operations, called programs. These programs enable computers to perform an extremely wide range of tasks. A "complete" computer including the hardware, the operating system (main software), and peripheral equipment required and used for "full" operation can be referred to as a computer system. This term may also be used for a group of computers that are connected and work together, in particular a computer network or computer cluster.

Computers are used as control systems for a wide variety of industrial and consumer devices. This includes simple special purpose devices like microwave ovens and remote controls, factory devices such as industrial robots and computer-aided design, and also general purpose devices like personal computers and mobile devices such as smartphones. The Internet is run on computers and it connects hundreds of millions of other computers and their users.

Early computers were only conceived as calculating devices. Since ancient times, simple manual devices like the abacus aided people in doing calculations. Early in the Industrial Revolution, some mechanical devices were built to automate long tedious tasks, such as guiding patterns for looms. More sophisticated electrical machines did specialized analog calculations in the early 20th century. The first digital electronic calculating machines were developed during World War II. The first semiconductor transistors in the late 1940s were followed by the silicon-based MOSFET (MOS transistor) and monolithic integrated circuit (IC) chip technologies in the late 1950s, leading to the microprocessor and the microcomputer revolution in the 1970s. The speed, power and versatility of computers have been increasing dramatically ever since then, with MOS transistor counts increasing at a rapid pace (as predicted by Moore's law), leading to the Digital Revolution during the late 20th to early 21st centuries.

Conventionally, a modern computer consists of at least one processing element, typically a central processing unit (CPU) in the form of a metal-oxide-semiconductor (MOS) microprocessor, along with some type of computer memory, typically MOS semiconductor memory chips. The processing element carries out arithmetic and logical operations, and a sequencing and control unit can change the order of operations in response to stored information. Peripheral devices include input devices (keyboards, mice, joysticks, etc.), output devices (monitor screens, printers, etc.), and input/output devices that perform both functions (e.g., the 2000s-era touchscreen). Peripheral devices allow information to be retrieved from an external source and enable the result of operations to be saved and retrieved.
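
To make the stored-program principle concrete, here is a hedged sketch of a fetch-decode-execute loop. The four-instruction set is invented for illustration and real CPUs are vastly more complex, but the essentials are the same: instructions live in memory, and the control unit (here, a program counter plus a jump test) changes the order of operations in response to stored values.

```python
# A toy fetch-decode-execute loop illustrating the stored-program concept.
# The instruction set (LOAD/ADD/JNZ/HALT) is invented for this sketch.

def run(program):
    acc, pc = 0, 0                      # accumulator and program counter
    while pc < len(program):
        op, arg = program[pc]           # fetch and decode
        if op == "LOAD":
            acc = arg                   # execute: load a constant
        elif op == "ADD":
            acc += arg                  # execute: arithmetic
        elif op == "JNZ":               # control: jump if acc is non-zero
            if acc != 0:
                pc = arg
                continue
        elif op == "HALT":
            break
        pc += 1
    return acc

# Count down from 5 to 0 by looping: ADD -1, then jump back while non-zero.
prog = [("LOAD", 5), ("ADD", -1), ("JNZ", 1), ("HALT", 0)]
print(run(prog))  # 0
```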

🔗 Edward Tufte

🔗 Biography 🔗 Mathematics 🔗 Statistics 🔗 Systems 🔗 Biography/science and academia 🔗 Systems/Visualization 🔗 Graphic design

Edward Rolf Tufte (born March 14, 1942) is an American statistician and professor emeritus of political science, statistics, and computer science at Yale University. He is noted for his writings on information design and as a pioneer in the field of data visualization.

🔗 Extreme Programming

🔗 Computing 🔗 Systems 🔗 Systems/Software engineering 🔗 Method engineering

Extreme programming (XP) is a software development methodology intended to improve software quality and responsiveness to changing customer requirements. As a type of agile software development, it advocates frequent releases in short development cycles, intended to improve productivity and introduce checkpoints at which new customer requirements can be adopted.

Other elements of extreme programming include programming in pairs or doing extensive code review, unit testing of all code, not programming features until they are actually needed, a flat management structure, code simplicity and clarity, expecting changes in the customer's requirements as time passes and the problem is better understood, and frequent communication with the customer and among programmers. The methodology takes its name from the idea that the beneficial elements of traditional software engineering practices are taken to "extreme" levels. As an example, code reviews are considered a beneficial practice; taken to the extreme, code can be reviewed continuously (i.e. the practice of pair programming).
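
As a small, generic illustration of the unit-testing and test-first practices mentioned above (the function and its pricing rules are invented for this example, not taken from the XP literature), the test is written first and the simplest code that passes it is added afterwards:

```python
# Test-first sketch in the XP spirit: the test below is written before
# shipping_cost() exists, then the simplest passing implementation is added.

def shipping_cost(weight_kg: float) -> float:
    """Flat rate up to 1 kg, then a per-kg surcharge: the simplest
    implementation that makes the tests pass."""
    base = 5.0
    return base if weight_kg <= 1.0 else base + 2.0 * (weight_kg - 1.0)

def test_shipping_cost():
    assert shipping_cost(0.5) == 5.0          # flat rate
    assert shipping_cost(1.0) == 5.0          # boundary case
    assert shipping_cost(3.0) == 9.0          # 5 + 2 * (3 - 1)
    print("all tests pass")

test_shipping_cost()
```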