Topic: Computing (Page 15)

You are looking at all articles with the topic "Computing". We found 481 matches.

πŸ”— Cyclomatic Complexity

πŸ”— Computing πŸ”— Computer science πŸ”— Computing/Software

Cyclomatic complexity is a software metric used to indicate the complexity of a program. It is a quantitative measure of the number of linearly independent paths through a program's source code. It was developed by Thomas J. McCabe, Sr. in 1976.

Cyclomatic complexity is computed using the control-flow graph of the program: the nodes of the graph correspond to indivisible groups of commands of a program, and a directed edge connects two nodes if the second command might be executed immediately after the first command. Cyclomatic complexity may also be applied to individual functions, modules, methods or classes within a program.
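
As a minimal sketch (not part of the article), the standard graph formula is M = E - N + 2P, where E is the number of edges, N the number of nodes and P the number of connected components; the control-flow graph below is a made-up function with a single if/else branch:

    def cyclomatic_complexity(nodes, edges, components=1):
        # M = E - N + 2P for the control-flow graph.
        return len(edges) - len(nodes) + 2 * components

    # Hypothetical control-flow graph of a function with one if/else:
    # entry -> cond, cond -> then, cond -> else, then -> exit, else -> exit
    nodes = ["entry", "cond", "then", "else", "exit"]
    edges = [("entry", "cond"), ("cond", "then"), ("cond", "else"),
             ("then", "exit"), ("else", "exit")]
    print(cyclomatic_complexity(nodes, edges))  # 5 - 5 + 2 = 2

The result of 2 matches the two linearly independent paths through the branch (the then path and the else path), which is also the number of test cases basis path testing would call for.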

One testing strategy, called basis path testing by McCabe who first proposed it, is to test each linearly independent path through the program; in this case, the number of test cases will equal the cyclomatic complexity of the program.

πŸ”— Overlapping Markup

πŸ”— Computing

In markup languages and the digital humanities, overlap occurs when a document has two or more structures that interact in a non-hierarchical manner. A document with overlapping markup cannot be represented as a tree. This is also known as concurrent markup. Overlap happens, for instance, in poetry, where there may be a metrical structure of feet and lines; a linguistic structure of sentences and quotations; and a physical structure of volumes and pages and editorial annotations.
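
As a rough illustration (the ranges and helper below are invented, not from the article), two annotation layers over the same text can be written as character offsets; when neither range contains the other and they are not disjoint, no single tree can nest them:

    def relation(a, b):
        """Classify two (start, end) character ranges."""
        (a0, a1), (b0, b1) = a, b
        if a1 <= b0 or b1 <= a0:
            return "disjoint"
        if (a0 <= b0 and b1 <= a1) or (b0 <= a0 and a1 <= b1):
            return "nested"       # representable as parent/child in one tree
        return "overlapping"      # the case a tree-shaped document cannot express

    line_1    = (0, 40)   # metrical layer: the first verse line
    quotation = (25, 55)  # linguistic layer: a quotation crossing the line break
    print(relation(line_1, quotation))  # "overlapping"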

πŸ”— SuperDisk

πŸ”— Computing πŸ”— Computing/Computer hardware

Not to be confused with SuperDrive, a trademark used by Apple Computer for various disk drive products, or the Super Disc, a CD add-on for the Super Nintendo Entertainment System.

The SuperDisk LS-120 is a high-speed, high-capacity alternative to the 90 mm (3.5 in), 1.44 MB floppy disk. The SuperDisk hardware was created by 3M's storage products group Imation in 1997, with manufacturing chiefly by Matsushita.

The SuperDisk had little success in North America, with Compaq, Gateway and Dell being among the few OEMs that supported it. It was more successful in Asia and Australia, where the second-generation SuperDisk LS-240 drive and disks were also released. SuperDisk manufacturing ceased worldwide in 2003.

πŸ”— Canon Cat

πŸ”— Computing

The Canon Cat was a task-dedicated desktop computer released by Canon Inc. in 1987 at a price of US$1,495. On the surface it was not unlike the dedicated word processors popular in the late 1970s to early 1980s, but it was far more powerful and incorporated many unique ideas for data manipulation.

πŸ”— Programma 101, the first commercial "desktop computer"

πŸ”— Computing

The Olivetti Programma 101, also known as Perottina or P101, is one of the first "all in one" commercial programmable desktop calculators, although not the first. Produced by Italian manufacturer Olivetti, based in Ivrea, Piedmont, and invented by the Italian engineer Pier Giorgio Perotto, the P101 has the main features of large computers of that period. It was launched at the 1964 New York World's Fair; volume production started in 1965. A futuristic design for its time, the Programma 101 was priced at $3,200 (equivalent to $26,000 in 2019). About 44,000 units were sold, primarily in the US.

It is usually called a printing programmable calculator or desktop calculator because its arithmetic instructions correspond to calculator operations.

πŸ”— Coherent OS

πŸ”— Computing πŸ”— Computing/Software πŸ”— Computing/Free and open-source software

Coherent is a clone of the Unix operating system for IBM PC compatibles and other microcomputers, developed and sold by the now-defunct Mark Williams Company (MWC). Historically, the operating system was a proprietary product, but it became open source in 2015, released under a 3-clause BSD License.

πŸ”— Law of Demeter

πŸ”— Computing

The Law of Demeter (LoD) or principle of least knowledge is a design guideline for developing software, particularly object-oriented programs. In its general form, the LoD is a specific case of loose coupling. The guideline was proposed by Ian Holland at Northeastern University towards the end of 1987, and can be succinctly summarized in each of the following ways:

  • Each unit should have only limited knowledge about other units: only units "closely" related to the current unit.
  • Each unit should only talk to its friends; don't talk to strangers.
  • Only talk to your immediate friends.

The fundamental notion is that a given object should assume as little as possible about the structure or properties of anything else (including its subcomponents), in accordance with the principle of "information hiding". It may be viewed as a corollary to the principle of least privilege, which dictates that a module possess only the information and resources necessary for its legitimate purpose.
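
As a minimal sketch (the Customer and Wallet classes are invented for illustration, not taken from the article), the usual violation is a chained call that reaches through an object's internals, versus asking the object to do the work itself:

    class Wallet:
        def __init__(self, balance):
            self.balance = balance

        def deduct(self, amount):
            if amount > self.balance:
                raise ValueError("insufficient funds")
            self.balance -= amount

    class Customer:
        def __init__(self, wallet):
            self._wallet = wallet

        # LoD-friendly: callers talk only to Customer, not to its internals.
        def pay(self, amount):
            self._wallet.deduct(amount)

    customer = Customer(Wallet(100))

    # Violates the guideline: the caller assumes Customer keeps a wallet
    # object with a deduct() method, coupling it to Customer's internal
    # structure.
    # customer._wallet.deduct(25)

    # Only talk to your immediate friend:
    customer.pay(25)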

It is so named for its origin in the Demeter Project, an adaptive programming and aspect-oriented programming effort. The project was named in honor of Demeter, "distribution-mother" and the Greek goddess of agriculture, to signify a bottom-up philosophy of programming which is also embodied in the law itself.

πŸ”— AT&T Hobbit

πŸ”— United States πŸ”— Computing πŸ”— Computing/Computer hardware πŸ”— Plan 9

The AT&T Hobbit is a microprocessor design that AT&T Corporation developed in the early 1990s. It was based on the company's CRISP (C-language Reduced Instruction Set Processor) design, which in turn grew out of Bell Labs' C Machine design of the late 1980s. C Machine, CRISP and Hobbit were optimized for running the C programming language. The design was partially RISC-like and concentrated on fast instruction decoding, indexed array access and procedure calls. The project ended in 1994 because the Hobbit failed to achieve commercially viable sales.

πŸ”— Piet is a programming language whose programs look like abstract art.

πŸ”— Computing πŸ”— Computer science πŸ”— Comedy

An esoteric programming language (sometimes shortened to esolang) is a programming language designed to test the boundaries of computer programming language design, as a proof of concept, as software art, as a hacking interface to another language (particularly functional programming or procedural programming languages), or as a joke. The use of esoteric distinguishes these languages from programming languages that working developers use to write software. Usually, an esolang's creators do not intend the language to be used for mainstream programming, although some esoteric features, such as visuospatial syntax, have inspired practical applications in the arts. Such languages are often popular among hackers and hobbyists.

Usability is rarely a goal for esoteric programming language designers; often the design leads to quite the opposite. Their usual aim is to remove or replace conventional language features while still maintaining a language that is Turing-complete, or even one for which the computational class is unknown.
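
As a concrete taste (using Brainfuck rather than Piet, since Piet programs are bitmaps that cannot be reproduced here; the interpreter below is a sketch, not from the article), one of the best-known esoteric languages keeps only eight single-character commands yet remains Turing-complete, and fits in a few lines of Python:

    def run_bf(code, inp=""):
        """Interpret Brainfuck: a byte tape, a data pointer, 8 commands."""
        tape, ptr, pc, out = [0] * 30000, 0, 0, []
        data = iter(inp)
        jumps, stack = {}, []          # pre-match the [ ] loop brackets
        for i, c in enumerate(code):
            if c == "[":
                stack.append(i)
            elif c == "]":
                j = stack.pop()
                jumps[i], jumps[j] = j, i
        while pc < len(code):
            c = code[pc]
            if c == ">":   ptr += 1
            elif c == "<": ptr -= 1
            elif c == "+": tape[ptr] = (tape[ptr] + 1) % 256
            elif c == "-": tape[ptr] = (tape[ptr] - 1) % 256
            elif c == ".": out.append(chr(tape[ptr]))
            elif c == ",": tape[ptr] = ord(next(data, "\0"))
            elif c == "[" and tape[ptr] == 0: pc = jumps[pc]
            elif c == "]" and tape[ptr] != 0: pc = jumps[pc]
            pc += 1
        return "".join(out)

    print(run_bf("++++++++[>++++++++<-]>+."))  # 8 * 8 + 1 = 65 -> "A"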

πŸ”— AI Winter

πŸ”— United States/U.S. Government πŸ”— United States πŸ”— Technology πŸ”— Computing πŸ”— Systems πŸ”— Cognitive science πŸ”— Linguistics πŸ”— Computing/Computer science πŸ”— Robotics πŸ”— Transhumanism πŸ”— Linguistics/Applied Linguistics πŸ”— Systems/Cybernetics

In the history of artificial intelligence, an AI winter is a period of reduced funding and interest in artificial intelligence research. The term was coined by analogy to the idea of a nuclear winter. The field has experienced several hype cycles, followed by disappointment and criticism, followed by funding cuts, followed by renewed interest years or decades later.

The term first appeared in 1984 as the topic of a public debate at the annual meeting of AAAI (then called the "American Association for Artificial Intelligence"). It is a chain reaction that begins with pessimism in the AI community, followed by pessimism in the press, followed by a severe cutback in funding, followed by the end of serious research. At the meeting, Roger Schank and Marvin Minsky, two leading AI researchers who had survived the "winter" of the 1970s, warned the business community that enthusiasm for AI had spiraled out of control in the 1980s and that disappointment would certainly follow. Three years later, the billion-dollar AI industry began to collapse.

Hype is common in many emerging technologies, such as the railway mania or the dot-com bubble. The AI winter was a result of such hype, due to over-inflated promises by developers, unnaturally high expectations from end-users, and extensive promotion in the media. Despite the rise and fall of AI's reputation, it has continued to develop new and successful technologies. AI researcher Rodney Brooks would complain in 2002 that "there's this stupid myth out there that AI has failed, but AI is around you every second of the day." In 2005, Ray Kurzweil agreed: "Many observers still think that the AI winter was the end of the story and that nothing since has come of the AI field. Yet today many thousands of AI applications are deeply embedded in the infrastructure of every industry."

Enthusiasm and optimism about AI have increased since its low point in the early 1990s. Beginning about 2012, interest in artificial intelligence (and especially the sub-field of machine learning) from the research and corporate communities led to a dramatic increase in funding and investment.
