Topic: Engineering (Page 4)

You are looking at all articles with the topic "Engineering". We found 43 matches.

Hint: To view all topics, click here. To see the most popular topics, click here instead.

🔗 Tommy Flowers

🔗 Biography 🔗 London 🔗 Biography/science and academia 🔗 Engineering

Thomas Harold Flowers MBE (22 December 1905 – 28 October 1998) was an English engineer with the British General Post Office. During World War II, Flowers designed and built Colossus, the world's first programmable electronic computer, to help decipher encrypted German messages.

🔗 Pareto Efficiency

🔗 Computer science 🔗 Economics 🔗 Engineering 🔗 Gender Studies

Pareto efficiency or Pareto optimality is a situation where no action or allocation is available that makes one individual better off without making another worse off. The concept is named after Vilfredo Pareto (1848–1923), Italian civil engineer and economist, who used the concept in his studies of economic efficiency and income distribution. The following three concepts are closely related:

  • Given an initial situation, a Pareto improvement is a new situation where some agents will gain, and no agents will lose.
  • A situation is called Pareto-dominated or Pareto-inefficient if there is some possible Pareto improvement that has not been made.
  • A situation is called Pareto-optimal or Pareto-efficient if no change could lead to improved satisfaction for some agent without some other agent losing or, equivalently, if there is no scope for further Pareto improvement (in other words, the situation is not Pareto-dominated).

The Pareto front (also called Pareto frontier or Pareto set) is the set of all Pareto-efficient situations.
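
As a concrete illustration of these definitions, here is a minimal sketch (in Python; the option scores are made up, and every criterion is assumed to be "higher is better") of checking Pareto dominance and extracting the Pareto front from a finite set of options:

    # Minimal sketch: Pareto dominance and the Pareto front for a finite
    # set of options, each scored on several criteria (higher is better).

    def dominates(a, b):
        """True if a is at least as good as b on every criterion and
        strictly better on at least one (moving from b to a would be a
        Pareto improvement)."""
        return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

    def pareto_front(options):
        """Options that are not Pareto-dominated by any other option."""
        return [a for i, a in enumerate(options)
                if not any(dominates(b, a) for j, b in enumerate(options) if j != i)]

    # Hypothetical options scored on (criterion 1, criterion 2).
    options = [(3, 5), (4, 4), (5, 2), (2, 2)]
    print(pareto_front(options))   # (2, 2) is dominated; the other three are Pareto-efficient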

Pareto originally used the word "optimal" for the concept, but because it describes a situation in which a limited number of people are made better off under finite resources, and takes no account of equality or social well-being, it is in effect a definition of efficiency and is better captured by that term.

In addition to the context of efficiency in allocation, the concept of Pareto efficiency also arises in the context of efficiency in production vs. x-inefficiency: a set of outputs of goods is Pareto-efficient if there is no feasible re-allocation of productive inputs such that output of one product increases while the outputs of all other goods either increase or remain the same.

Pareto efficiency is measured along the production possibility frontier (PPF), which is a graphical representation of all the possible options of output for two products that can be produced using all factors of production.

Besides economics, the notion of Pareto efficiency has been applied to the selection of alternatives in engineering and biology. Each option is first assessed under multiple criteria, and a subset of options is then identified with the property that no other option outperforms it across the board. This is a statement of the impossibility of improving one variable without harming other variables, which is the subject of multi-objective optimization (also termed Pareto optimization).

🔗 Hyperloop

🔗 Technology 🔗 Physics 🔗 Transport 🔗 Trains 🔗 Engineering

A Hyperloop is a proposed mode of passenger and freight transportation; the term was first used to describe an open-source vactrain design released by a joint team from Tesla and SpaceX. A hyperloop is a sealed tube or system of tubes through which a pod may travel free of air resistance or friction, conveying people or objects at high speed with high energy efficiency and thereby drastically reducing travel times over medium-range distances.

Elon Musk's version of the concept, first publicly mentioned in 2012, incorporates reduced-pressure tubes in which pressurized capsules ride on air bearings driven by linear induction motors and axial compressors.

The Hyperloop Alpha concept was first published in August 2013, proposing and examining a route running from the Los Angeles region to the San Francisco Bay Area, roughly following the Interstate 5 corridor. The Hyperloop Genesis paper conceived of a hyperloop system that would propel passengers along the 350-mile (560 km) route at a speed of 760 mph (1,200 km/h), allowing for a travel time of 35 minutes, considerably faster than current rail or air travel times. Preliminary cost estimates for this suggested LA–SF route were included in the white paper: US$6 billion for a passenger-only version, and US$7.5 billion for a somewhat larger-diameter version transporting passengers and vehicles. Transportation analysts doubted that the system could be constructed on that budget, however; some claimed that the Hyperloop would be several billion dollars over budget once construction, development, and operating costs were taken into account.

The Hyperloop concept has been explicitly "open-sourced" by Musk and SpaceX, and others have been encouraged to take the ideas and further develop them. To that end, a few companies have been formed, and several interdisciplinary student-led teams are working to advance the technology. SpaceX built an approximately 1-mile-long (1.6 km) subscale track for its pod design competition at its headquarters in Hawthorne, California.

🔗 Citicorp Center Engineering Crisis

🔗 New York City 🔗 Engineering

The Citicorp Center engineering crisis was the discovery, in 1978, of a significant structural flaw in Citicorp Center, then a recently completed skyscraper in New York City, and the subsequent effort to quietly make repairs over the next few months. The building, now known as Citigroup Center, occupied an entire block and was to be the headquarters of Citibank. Its structure, designed by William LeMessurier, had several unusual design features, including a raised base supported by four offset stilts, and diagonal bracing which absorbed wind loads from upper stories.

In the original design, potential wind loads for the building were calculated incorrectly. The flaw was discovered by Diane Hartley, an undergraduate student at Princeton University who was writing a thesis on the building, and was communicated to the firm responsible for the structural design. LeMessurier was subsequently lauded for acknowledging his error and orchestrating a successful repair effort. Estimates at the time suggested that the building could be toppled by a 70-mile-per-hour (110 km/h) wind, potentially killing many people. The crisis was kept secret until 1995, and Hartley did not learn the significance of her work until after that time.

🔗 Pareto Front

🔗 Computer science 🔗 Economics 🔗 Engineering

In multi-objective optimization, the Pareto front (also called Pareto frontier or Pareto curve) is the set of all Pareto-efficient solutions. The concept is widely used in engineering. It allows the designer to restrict attention to the set of efficient choices, and to make tradeoffs within this set, rather than considering the full range of every parameter.
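
As a rough sketch of what "making tradeoffs within this set" can look like in practice, the snippet below uses a weighted-sum scalarization over an already-computed front; this is only one common approach, and the objective names, weights, and values are hypothetical.

    # Sketch: choosing one design from an already-computed Pareto front by
    # weighting the objectives (assumed "higher is better").

    def pick_tradeoff(front, weights):
        """Return the front member with the best weighted score."""
        return max(front, key=lambda point: sum(w * x for w, x in zip(weights, point)))

    # Hypothetical front scored on (speed, efficiency); speed weighted heavily.
    front = [(3, 5), (4, 4), (5, 2)]
    print(pick_tradeoff(front, weights=(3, 1)))   # -> (5, 2)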

🔗 Computer

🔗 Technology 🔗 Video games 🔗 Computing 🔗 Computer science 🔗 Computing/Computer hardware 🔗 Systems 🔗 Computing/Software 🔗 Engineering 🔗 Home Living

A computer is a machine that can be instructed to carry out sequences of arithmetic or logical operations automatically via computer programming. Modern computers have the ability to follow generalized sets of operations, called programs. These programs enable computers to perform an extremely wide range of tasks. A "complete" computer including the hardware, the operating system (main software), and peripheral equipment required and used for "full" operation can be referred to as a computer system. This term may also be used for a group of computers that are connected and work together, in particular a computer network or computer cluster.

Computers are used as control systems for a wide variety of industrial and consumer devices. This includes simple special purpose devices like microwave ovens and remote controls, factory devices such as industrial robots and computer-aided design, and also general purpose devices like personal computers and mobile devices such as smartphones. The Internet is run on computers and it connects hundreds of millions of other computers and their users.

Early computers were only conceived as calculating devices. Since ancient times, simple manual devices like the abacus aided people in doing calculations. Early in the Industrial Revolution, some mechanical devices were built to automate long tedious tasks, such as guiding patterns for looms. More sophisticated electrical machines did specialized analog calculations in the early 20th century. The first digital electronic calculating machines were developed during World War II. The first semiconductor transistors in the late 1940s were followed by the silicon-based MOSFET (MOS transistor) and monolithic integrated circuit (IC) chip technologies in the late 1950s, leading to the microprocessor and the microcomputer revolution in the 1970s. The speed, power and versatility of computers have been increasing dramatically ever since then, with MOS transistor counts increasing at a rapid pace (as predicted by Moore's law), leading to the Digital Revolution during the late 20th to early 21st centuries.

Conventionally, a modern computer consists of at least one processing element, typically a central processing unit (CPU) in the form of a metal-oxide-semiconductor (MOS) microprocessor, along with some type of computer memory, typically MOS semiconductor memory chips. The processing element carries out arithmetic and logical operations, and a sequencing and control unit can change the order of operations in response to stored information. Peripheral devices include input devices (keyboards, mice, joysticks, etc.), output devices (monitor screens, printers, etc.), and input/output devices that perform both functions (e.g., the 2000s-era touchscreen). Peripheral devices allow information to be retrieved from an external source and they enable the result of operations to be saved and retrieved.

🔗 Overengineering – I see this every day, please stop

🔗 Technology 🔗 Engineering

Overengineering (or over-engineering, or overkill) is the act of designing a product to be more robust or to have more features than necessary for its intended use, or of making a process unnecessarily complex or inefficient.

Overengineering is often done to increase a factor of safety, add functionality, or overcome perceived design flaws that most users would accept.

Overengineering can be desirable when safety or performance is critical (e.g. in aerospace vehicles and luxury road vehicles), or when extremely broad functionality is required (e.g. diagnostic and medical tools, power users of products), but it is generally criticized in terms of value engineering as wasteful of resources such as materials, time and money.

As a design philosophy, it is the opposite of the minimalist ethos of "less is more" (or "worse is better") and a violation of the KISS principle.

Overengineering generally occurs in high-end products or specialized markets. In one form, products are overbuilt and have performance far in excess of expected normal operation (a city car that can travel at 300 km/h, or a home video recorder with a projected lifespan of 100 years), and hence are more expensive, bulkier, and heavier than necessary. Alternatively, they may become overcomplicated – the extra functions may be unnecessary, and may reduce the usability of the product by overwhelming less experienced or less technically literate end users, as in feature creep.

Overengineering can decrease the productivity of design teams, because of the need to build and maintain more features than most users need.

A related issue is market segmentation – making different products for different market segments. In this context, a particular product may be more or less suited (and thus considered over- or under-engineered) for a particular market segment.

🔗 Speed Tape

🔗 Aviation 🔗 Military history 🔗 Military history/Military aviation 🔗 Engineering 🔗 Industrial design

Speed tape is an aluminium pressure-sensitive tape used to perform minor repairs on aircraft and racing cars. It is used as a temporary repair material until a more permanent repair can be carried out. It has an appearance similar to duct tape, for which it is sometimes mistaken, but its adhesive is capable of sticking to an airplane fuselage or wing at high speeds, hence the name.

🔗 Terahertz Gap

🔗 Technology 🔗 Physics 🔗 Radio 🔗 Astronomy 🔗 Engineering

In engineering, the terahertz gap is a frequency band in the terahertz region of the electromagnetic spectrum between radio waves and infrared light for which practical technologies for generating and detecting the radiation do not exist. It is defined as 0.1 to 10 THz (wavelengths of 3 mm to 30 µm). Currently, at frequencies within this range, useful power generation and receiver technologies are inefficient and unfeasible.

Mass production of devices in this range and operation at room temperature (at which the thermal energy k·T equals the energy of a photon with a frequency of 6.2 THz) are mostly impractical. This leaves a gap between mature microwave technologies in the highest frequencies of the radio spectrum and the well-developed optical engineering of infrared detectors at their lowest frequencies. This radiation is mostly used in small-scale, specialized applications such as submillimetre astronomy. Research that attempts to resolve this issue has been conducted since the late 20th century.
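
The parenthetical about room temperature can be checked directly: setting the photon energy h·f equal to the thermal energy k·T gives f = k·T/h. A quick arithmetic sketch, taking 300 K as an assumed value for room temperature:

    # f = k*T / h at an assumed room temperature of 300 K.
    k_B = 1.380649e-23    # Boltzmann constant, J/K
    h = 6.62607015e-34    # Planck constant, J*s
    T = 300.0             # assumed room temperature, K

    f = k_B * T / h
    print(f / 1e12)       # ~6.25, i.e. roughly the 6.2 THz quoted above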

🔗 Muntzing

🔗 Technology 🔗 Electronics 🔗 Engineering 🔗 Industrial design

Muntzing is the practice and technique of reducing the components inside an electronic appliance to the minimum required for it to sufficiently function in most operating conditions, reducing design margins above minimum requirements toward zero. The term is named after the man who invented it, Earl "Madman" Muntz, a car and electronics salesman, who was not formally educated or trained in any science or engineering discipline.

In the 1940s and 1950s, television receivers were relatively new to the consumer market, and were more complex pieces of equipment than the radios then in popular use. TVs often contained upwards of 30 vacuum tubes, as well as transformers, rheostats, and other electronics. High component costs meant high sales prices, which limited the potential for high-volume sales. Muntz was suspicious of complexity in circuit designs, and determined through simple trial and error that he could remove a significant number of electronic components from a circuit design and still end up with a monochrome TV that worked sufficiently well in urban areas, close to transmission towers where the broadcast signal was strong. He carried a pair of wire clippers, and when he felt that one of his builders was overengineering a circuit, he would begin snipping out components. When the TV stopped functioning, he would have the technician reinsert the last removed part. He would repeat the snipping in other portions of the circuit until he was satisfied with his simplification, and then leave the TV as it was, without further testing under more adverse signal-reception conditions.
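
The procedure described above amounts to a greedy remove-and-test loop. The sketch below is only an illustration of that logic, not anything Muntz documented; the component names and the circuit_works check are hypothetical stand-ins for "the set still works near a strong transmitter".

    # Illustration of the remove-and-test loop described above.
    # `circuit_works` is a hypothetical stand-in for "the TV still functions
    # under strong-signal conditions"; component names are placeholders.

    def muntz(components, circuit_works):
        """Greedily snip out parts one at a time, reinserting the last
        removed part only if the circuit stops working without it."""
        kept = list(components)
        for part in list(kept):
            kept.remove(part)           # snip the part out
            if not circuit_works(kept):
                kept.append(part)       # it was needed after all; put it back
        return kept

    # Hypothetical usage: only 'tube_3' is actually required.
    needed = {"tube_3"}
    print(muntz(["tube_1", "tube_2", "tube_3"], lambda kept: needed <= set(kept)))
    # -> ['tube_3']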

As a result, he reduced his costs and increased his profits, at the expense of poorer performance at locations more distant from urban centers. He reasoned that population density was higher in and near the urban centers where the TVs would work, and lower further out where the TVs would not, so the Muntz TVs were adequate for a very large fraction of his customers. Sets that did not work further out could be returned, at the customer's additional effort and expense rather than Muntz's. He put fewer resources into the product, intentionally accepting bare-minimum performance quality, and more into advertising and sales promotions.
