Topic: Computer science (Page 4)

You are looking at all articles with the topic "Computer science". We found 123 matches.

Hint: To view all topics, click here. To see the most popular topics, click here instead.

🔗 Diffie-Hellman key exchange, explained using colors

🔗 Computing 🔗 Computer science 🔗 Cryptography 🔗 Cryptography/Computer science

Diffie–Hellman key exchange is a method of securely exchanging cryptographic keys over a public channel. It was one of the first public-key protocols, originally conceived by Ralph Merkle and named after Whitfield Diffie and Martin Hellman. DH is one of the earliest practical examples of public key exchange implemented within the field of cryptography.

Traditionally, secure encrypted communication between two parties required that they first exchange keys by some secure physical means, such as paper key lists transported by a trusted courier. The Diffie–Hellman key exchange method allows two parties that have no prior knowledge of each other to jointly establish a shared secret key over an insecure channel. This key can then be used to encrypt subsequent communications using a symmetric key cipher.
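
The arithmetic behind the exchange fits in a few lines. Below is a toy sketch in Python using small, deliberately insecure parameters (p = 23, g = 5); the private exponents are chosen arbitrarily for illustration, and real deployments use large safe primes (or elliptic curves) and authenticate the exchange.

```python
# Toy Diffie–Hellman sketch with tiny, insecure parameters (illustration only).
p, g = 23, 5              # public modulus and generator, agreed in the open

a = 6                     # Alice's private exponent (kept secret)
b = 15                    # Bob's private exponent (kept secret)

A = pow(g, a, p)          # Alice sends A = g^a mod p over the public channel
B = pow(g, b, p)          # Bob sends B = g^b mod p over the public channel

alice_secret = pow(B, a, p)   # (g^b)^a mod p
bob_secret = pow(A, b, p)     # (g^a)^b mod p

assert alice_secret == bob_secret
print(alice_secret)       # both sides now hold the same shared secret: 2
```

An eavesdropper sees p, g, A, and B, but recovering a or b from them is the discrete logarithm problem, which is believed to be infeasible for properly sized parameters.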

Diffie–Hellman is used to secure a variety of Internet services. However, research published in October 2015 suggests that the parameters in use for many DH Internet applications at that time are not strong enough to prevent compromise by very well-funded attackers, such as the security services of large governments.

The scheme was published by Whitfield Diffie and Martin Hellman in 1976, but in 1997 it was revealed that James H. Ellis, Clifford Cocks, and Malcolm J. Williamson of GCHQ, the British signals intelligence agency, had previously shown in 1969 how public-key cryptography could be achieved.

Although Diffie–Hellman key agreement itself is a non-authenticated key-agreement protocol, it provides the basis for a variety of authenticated protocols, and is used to provide forward secrecy in Transport Layer Security's ephemeral modes (referred to as EDH or DHE depending on the cipher suite).

The method was followed shortly afterwards by RSA, an implementation of public-key cryptography using asymmetric algorithms.

Expired U.S. Patent 4,200,770 from 1977 describes the now public-domain algorithm. It credits Hellman, Diffie, and Merkle as inventors.

Discussed on

🔗 Amdahl's Law

🔗 Computer science

In computer architecture, Amdahl's law (or Amdahl's argument) is a formula which gives the theoretical speedup in latency of the execution of a task at fixed workload that can be expected of a system whose resources are improved. It is named after computer scientist Gene Amdahl, and was presented at the AFIPS Spring Joint Computer Conference in 1967.

Amdahl's law is often used in parallel computing to predict the theoretical speedup when using multiple processors. For example, if a program needs 20 hours to complete using a single thread, but a one-hour portion of the program cannot be parallelized, then only the remaining 19 hours (p = 0.95) of execution time can be parallelized. Regardless of how many threads are devoted to a parallelized execution of this program, the minimum execution time cannot be less than one hour. Hence, the theoretical speedup is limited to at most 20 times the single-thread performance, since 1/(1 − p) = 1/(1 − 0.95) = 20.
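
As a quick sanity check of that example, here is a minimal Python sketch of the general form of the law, speedup(p, n) = 1 / ((1 − p) + p/n); the function name and sample processor counts are only illustrative.

```python
def amdahl_speedup(p, n):
    """Theoretical speedup with parallelizable fraction p on n processors."""
    return 1.0 / ((1.0 - p) + p / n)

# The worked example above: 1 serial hour out of 20, so p = 0.95.
for n in (1, 2, 8, 64, 1024):
    print(n, round(amdahl_speedup(0.95, n), 2))
# As n grows, the speedup approaches 1 / (1 - 0.95) = 20 but never exceeds it.
```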

Discussed on

🔗 Eiffel programming language

🔗 Computing 🔗 Computer science

Eiffel is an object-oriented programming language designed by Bertrand Meyer (an object-orientation proponent and author of Object-Oriented Software Construction) and Eiffel Software. Meyer conceived the language in 1985 with the goal of increasing the reliability of commercial software development; the first version became available in 1986. In 2005, Eiffel became an ISO-standardized language.

The design of the language is closely connected with the Eiffel programming method. Both are based on a set of principles, including design by contract, command–query separation, the uniform-access principle, the single-choice principle, the open–closed principle, and option–operand separation.
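
Design by contract is the most distinctive of these principles: routines state preconditions they require from callers and postconditions they guarantee in return. Eiffel expresses this natively with require/ensure clauses; the Python decorator below is only a rough, hypothetical sketch of the same idea, not Eiffel syntax.

```python
def contract(require, ensure):
    """Wrap a function with a precondition and a postcondition check."""
    def wrap(fn):
        def checked(*args):
            assert require(*args), "precondition violated by the caller"
            result = fn(*args)
            assert ensure(result, *args), "postcondition violated by the routine"
            return result
        return checked
    return wrap

@contract(require=lambda x: x >= 0,                      # caller's obligation
          ensure=lambda r, x: abs(r * r - x) < 1e-9)     # routine's promise
def square_root(x):
    return x ** 0.5

print(square_root(2.0))   # ok; square_root(-1.0) would fail the precondition
```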

Many concepts initially introduced by Eiffel later found their way into Java, C#, and other languages. New language design ideas, particularly through the Ecma/ISO standardization process, continue to be incorporated into the Eiffel language.

Discussed on

🔗 Hutter Prize

🔗 Computer science

The Hutter Prize is a cash prize funded by Marcus Hutter which rewards data compression improvements on a specific 1 GB English text file. Specifically, the prize awards 5000 euros for each one percent improvement (with 500,000 euros total funding) in the compressed size of the file enwik9, which is the larger of two files used in the Large Text Compression Benchmark; enwik9 is the first 1,000,000,000 characters of a specific version of English Wikipedia. The ongoing competition is organized by Hutter, Matt Mahoney, and Jim Bowery.
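
Read literally, the payout rule above is simple proportional arithmetic. The sketch below is only an illustrative reading of that description (the official rules impose additional constraints not captured here); the function name and sample sizes are made up.

```python
def prize_euros(previous_record_bytes, new_record_bytes, euros_per_percent=5000):
    """Payout for a new record under the '5000 euros per 1% improvement' rule above."""
    improvement_pct = 100.0 * (previous_record_bytes - new_record_bytes) / previous_record_bytes
    return improvement_pct * euros_per_percent

# Hypothetical: improving a 120 MB record to 117.6 MB is a 2% improvement.
print(prize_euros(120_000_000, 117_600_000))   # 10000.0 euros
```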

Discussed on

🔗 Jackson structured programming

🔗 Computer science 🔗 Systems 🔗 Systems/Scientific modeling

Jackson structured programming (JSP) is a method for structured programming developed by British software consultant Michael A. Jackson and described in his 1975 book Principles of Program Design. The technique of JSP is to analyze the data structures of the files that a program must read as input and produce as output, and then produce a program design based on those data structures, so that the program control structure handles those data structures in a natural and intuitive way.

JSP describes structures (of both data and programs) using three basic structures – sequence, iteration, and selection (or alternatives). These structures are diagrammed as (in effect) a visual representation of a regular expression.
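
The core move of JSP is that the shape of the program mirrors the shape of the data. As a rough Python sketch (the input format and helper name are invented for illustration), a file that is an iteration of records, each a sequence of fields with one selection between valid and invalid values, yields a program with exactly one loop, one field-splitting step, and one branch.

```python
def summarize(lines):
    """Program structure mirrors the data structure:
    file = iteration of records; record = sequence of (key, value);
    value = selection between a number and anything else."""
    totals = {}
    for line in lines:                 # iteration over records
        key, value = line.split()      # sequence of fields within a record
        if value.isdigit():            # selection: valid vs. invalid value
            totals[key] = totals.get(key, 0) + int(value)
    return totals

print(summarize(["a 3", "b 5", "a 2", "b oops"]))   # {'a': 5, 'b': 5}
```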

Discussed on

🔗 Polytope Model

🔗 Computer science

The polyhedral model (also called the polytope method) is a mathematical framework for programs that perform large numbers of operations (too large to be explicitly enumerated), thereby requiring a compact representation. Nested loop programs are the typical, but not the only, example, and the most common use of the model is for loop nest optimization in program optimization. The polyhedral method treats each loop iteration within nested loops as a lattice point inside mathematical objects called polyhedra, performs affine transformations or more general non-affine transformations such as tiling on the polytopes, and then converts the transformed polytopes into equivalent, but optimized (depending on the targeted optimization goal), loop nests through polyhedra scanning.
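
For intuition, the iteration space of a doubly nested loop over 0 ≤ i < N, 0 ≤ j < M is the set of integer points in a rectangle, and a transformation such as tiling re-schedules those points without adding or dropping any. The Python sketch below (with made-up sizes) checks exactly that property; production polyhedral compilers work on symbolic affine descriptions rather than enumerating points.

```python
N, M, TILE = 6, 6, 2   # illustrative loop bounds and tile size

# Original loop nest: visit every lattice point (i, j) of the rectangular polytope.
original = [(i, j) for i in range(N) for j in range(M)]

# Tiled loop nest: iterate over TILE x TILE blocks, then over points within each block.
tiled = [(i, j)
         for ii in range(0, N, TILE)
         for jj in range(0, M, TILE)
         for i in range(ii, min(ii + TILE, N))
         for j in range(jj, min(jj + TILE, M))]

# Same iteration domain, different schedule (often better cache locality).
assert sorted(original) == sorted(tiled)
print(len(tiled), "iterations in both versions")
```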

Discussed on

🔗 COMEFROM

🔗 Computing 🔗 Computer science

In computer programming, COMEFROM (or COME FROM) is an obscure control flow structure used in some programming languages, originally as a joke. COMEFROM is roughly the opposite of GOTO in that it can take the execution state from any arbitrary point in code to a COMEFROM statement.

The point in code where the state transfer happens is usually given as a parameter to COMEFROM. Whether the transfer happens before or after the instruction at the specified transfer point depends on the language used. Depending on the language used, multiple COMEFROMs referencing the same departure point may be invalid, be non-deterministic, be executed in some sort of defined priority, or even induce parallel or otherwise concurrent execution as seen in Threaded Intercal.

A simple example of a "COMEFROM x" statement is a label x (which does not need to be physically located anywhere near its corresponding COMEFROM) that acts as a "trap door". When code execution reaches the label, control gets passed to the statement following the COMEFROM. This may also be conditional, passing control only if a condition is satisfied, analogous to a GOTO within an IF statement. The primary difference from GOTO is that GOTO only depends on the local structure of the code, while COMEFROM depends on the global structure – a GOTO transfers control when it reaches a line with a GOTO statement, while COMEFROM requires scanning the entire program or scope to see if any COMEFROM statements are in scope for the line, and then verifying if a condition is hit. The effect of this is primarily to make debugging (and understanding the control flow of the program) extremely difficult, since there is no indication near the line or label in question that control will mysteriously jump to another point of the program – one must study the entire program to see if any COMEFROM statements reference that line or label.

Debugger hooks can be used to implement a COMEFROM statement, as in the humorous Python goto module; see below. This also can be implemented with the gcc feature "asm goto" as used by the Linux kernel configuration option CONFIG_JUMP_LABEL. A no-op has its location stored, to be replaced by a jump to an executable fragment that at its end returns to the instruction after the no-op.

Discussed on

🔗 Knuth–Morris–Pratt algorithm

🔗 Computer science

In computer science, the Knuth–Morris–Pratt string-searching algorithm (or KMP algorithm) searches for occurrences of a "word" W within a main "text string" S by employing the observation that when a mismatch occurs, the word itself embodies sufficient information to determine where the next match could begin, thus bypassing re-examination of previously matched characters.
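
A compact implementation makes that observation concrete: a precomputed failure table records, for each prefix of the word, the longest proper prefix that is also a suffix, so on a mismatch the search falls back within the word instead of re-reading the text. The Python sketch below follows the standard construction; names are illustrative.

```python
def kmp_search(text, word):
    """Return all start indices of `word` in `text` in O(len(text) + len(word))."""
    # fail[i]: length of the longest proper prefix of word[:i+1] that is also its suffix.
    fail = [0] * len(word)
    k = 0
    for i in range(1, len(word)):
        while k > 0 and word[i] != word[k]:
            k = fail[k - 1]
        if word[i] == word[k]:
            k += 1
        fail[i] = k

    matches, k = [], 0
    for i, ch in enumerate(text):
        while k > 0 and ch != word[k]:
            k = fail[k - 1]          # fall back; never re-examine matched text characters
        if ch == word[k]:
            k += 1
        if k == len(word):
            matches.append(i - len(word) + 1)
            k = fail[k - 1]
    return matches

print(kmp_search("ABC ABCDAB ABCDABCDABDE", "ABCDABD"))   # [15]
```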

The algorithm was conceived by James H. Morris and independently discovered by Donald Knuth "a few weeks later" from automata theory. Morris and Vaughan Pratt published a technical report in 1970. The three also published the algorithm jointly in 1977. Independently, in 1969, Matiyasevich discovered a similar algorithm, coded by a two-dimensional Turing machine, while studying a string-pattern-matching recognition problem over a binary alphabet. This was the first linear-time algorithm for string matching.

Discussed on

🔗 Q (Number Format)

🔗 Computer science

The Q notation is a succinct way to specify the parameters of a binary fixed-point number format. A number of other notations have been used for the same purpose.
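
In the common reading of the notation, a Qm.n value stores a real number x as the integer round(x · 2^n), with m integer bits and n fractional bits. The Python sketch below illustrates a 15-fractional-bit (Q15-style) encode, decode, and the renormalizing shift after multiplication; the helper names are made up.

```python
def to_q(x, n_frac=15):
    """Encode a real number as a fixed-point integer with n_frac fractional bits."""
    return round(x * (1 << n_frac))

def from_q(q, n_frac=15):
    """Decode a fixed-point integer back to a float."""
    return q / (1 << n_frac)

def q_mul(a, b, n_frac=15):
    """The raw product has 2*n_frac fractional bits; shift right to renormalize."""
    return (a * b) >> n_frac

a, b = to_q(0.75), to_q(0.5)
print(from_q(q_mul(a, b)))   # 0.375
```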

Discussed on

🔗 List of unsolved problems in computer science

🔗 Computing 🔗 Computer science 🔗 Computing/Computer science

This article is a list of notable unsolved problems in computer science. A problem in computer science is considered unsolved when no solution is known, or when experts in the field disagree about proposed solutions.

Discussed on