Topic: Physics (Page 7)

You are looking at all articles with the topic "Physics". We found 181 matches.

🔗 Phased Array

🔗 Technology 🔗 Physics 🔗 Telecommunications 🔗 Radio 🔗 Electronics 🔗 Engineering

In antenna theory, a phased array usually means an electronically scanned array, a computer-controlled array of antennas which creates a beam of radio waves that can be electronically steered to point in different directions without moving the antennas. The general theory of an electromagnetic phased array also finds applications in ultrasonic and medical imaging (phased array ultrasonics) and in optics (optical phased array).

In a simple array antenna, the radio frequency current from the transmitter is fed to multiple individual antenna elements with the proper phase relationship so that the radio waves from the separate elements combine (superpose) to form beams, to increase power radiated in desired directions and suppress radiation in undesired directions. In a phased array, the power from the transmitter is fed to the radiating elements through devices called phase shifters, controlled by a computer system, which can alter the phase or signal delay electronically, thus steering the beam of radio waves to a different direction. Since the size of an antenna array must extend many wavelengths to achieve the high gain needed for narrow beamwidth, phased arrays are mainly practical at the high frequency end of the radio spectrum, in the UHF and microwave bands, in which the operating wavelengths are conveniently small.
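To make the phase relationship concrete, here is a minimal numerical sketch of beam steering for a uniform linear array. The element count, spacing and steering angle are illustrative assumptions, not values from the article; the idea is simply that element n is driven with a phase offset of −2π·n·(d/λ)·sin θ₀ so that all element contributions add in phase toward the chosen direction θ₀.

```python
import numpy as np

# Illustrative uniform linear array; all parameters below are assumed example values.
n_elements = 8
wavelength = 0.03            # e.g. a 10 GHz carrier, in metres
spacing = wavelength / 2     # half-wavelength element spacing
steer_deg = 30.0             # desired beam direction measured from broadside

# Phase applied by the phase shifter feeding element n so that the element
# contributions add coherently toward steer_deg.
n = np.arange(n_elements)
phase = -2 * np.pi * n * (spacing / wavelength) * np.sin(np.radians(steer_deg))

# Array factor: magnitude of the summed far-field contribution versus look angle.
look = np.radians(np.linspace(-90, 90, 721))
k = 2 * np.pi / wavelength
af = np.abs(np.exp(1j * (k * spacing * n[:, None] * np.sin(look) + phase[:, None])).sum(axis=0))

print(f"array factor peaks near {np.degrees(look[np.argmax(af)]):.1f} degrees")  # about 30
```

Steering the beam to a new direction is then just a matter of recomputing the phase offsets, with no mechanical motion of the elements.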

Phased arrays were originally conceived for use in military radar systems, to steer a beam of radio waves quickly across the sky to detect planes and missiles. These systems are now widely used and have spread to civilian applications such as 5G MIMO for cell phones. The phased array principle is also used in acoustics, and phased arrays of acoustic transducers are used in medical ultrasound imaging scanners (phased array ultrasonics), oil and gas prospecting (reflection seismology), and military sonar systems.

The term "phased array" is also used to a lesser extent for unsteered array antennas in which the phase of the feed power and thus the radiation pattern of the antenna array is fixed. For example, AM broadcast radio antennas consisting of multiple mast radiators fed so as to create a specific radiation pattern are also called "phased arrays".

🔗 Elitzur–Vaidman bomb tester

🔗 Physics

The Elitzur–Vaidman bomb-tester is a quantum mechanics thought experiment that uses interaction-free measurements to verify that a bomb is functional without having to detonate it. It was conceived in 1993 by Avshalom Elitzur and Lev Vaidman. Since their publication, real-world experiments have confirmed that their theoretical method works as predicted.

The bomb tester takes advantage of two characteristics of elementary particles, such as photons or electrons: nonlocality and wave-particle duality. By placing the particle in a quantum superposition, it is possible for the experiment to verify that the bomb works without triggering its detonation, although there is still a 50% chance that the bomb will detonate in the effort.
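A rough way to see where the 50% figure comes from is to push a single-photon amplitude through a balanced Mach–Zehnder interferometer with the bomb in one arm. The toy model below is an illustrative sketch (the 50/50 beam-splitter convention and port labels are assumptions, not taken from the article); it reproduces the textbook probabilities: 1/2 detonation, 1/4 bomb certified live without detonation, 1/4 inconclusive.

```python
import numpy as np

# Two optical modes: index 0 = arm without the bomb, index 1 = arm with the bomb.
# Balanced 50/50 beam splitter in the Hadamard convention (an assumption of this
# toy model; other phase conventions give the same probabilities).
BS = np.array([[1, 1],
               [1, -1]]) / np.sqrt(2)

photon_in = np.array([1.0, 0.0])      # photon enters in mode 0

# Dud bomb: nothing measures which arm the photon took, so both beam splitters act
# and the photon always exits at the bright port.
dud_out = BS @ (BS @ photon_in)
print("dud:  P(bright port) =", abs(dud_out[0])**2, " P(dark port) =", abs(dud_out[1])**2)

# Live bomb: after the first beam splitter the bomb acts as a which-arm measurement.
mid = BS @ photon_in
p_explode = abs(mid[1])**2            # photon found in the bomb arm: detonation
collapsed = np.array([1.0, 0.0])      # otherwise the state collapses to the bomb-free arm
survivor_out = BS @ collapsed
p_dark = (1 - p_explode) * abs(survivor_out[1])**2    # dark port clicks: bomb certified live
p_bright = (1 - p_explode) * abs(survivor_out[0])**2  # bright port clicks: inconclusive
print("live: P(explode) =", p_explode, " P(certified live) =", p_dark,
      " P(inconclusive) =", p_bright)
```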

🔗 Relativistic kill vehicle

🔗 Physics 🔗 Firearms

A projectile is any object thrown into space (empty or not) by the exertion of a force. Although any object in motion through space (for example a thrown baseball) may be called a projectile, the term more commonly refers to a ranged weapon. Mathematical equations of motion are used to analyze projectile trajectories. An object projected with initial speed u at an angle θ to the horizontal has both vertical and horizontal components of velocity: the vertical component is u·sin θ and the horizontal component is u·cos θ. Several quantities are commonly used to describe a projectile launched at angle θ; a short numerical illustration follows the list.

1. Time to reach maximum height (t): the time taken for the projectile to reach its maximum height above the plane of projection, given by t = u·sin θ / g, where g is the acceleration due to gravity (approximately 9.81 m/s²).

2. Time of flight (T): the total time taken for the projectile to fall back to the plane from which it was projected, given by T = 2u·sin θ / g.

3. Maximum height (H): the greatest vertical displacement attained by the projectile, given by H = u²·sin² θ / (2g).

4. Range (R): the horizontal distance covered by the projectile, given by R = u²·sin 2θ / g. The range is maximum when θ = 45°, i.e. when sin 2θ = 1.
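As a quick numerical check of these formulas, here is a minimal sketch; the launch speed and angle are arbitrary example values, not taken from the article.

```python
import math

g = 9.81                      # acceleration due to gravity, m/s^2
u = 30.0                      # example initial speed, m/s (arbitrary)
theta = math.radians(45.0)    # example launch angle (arbitrary; 45 deg maximizes range)

t_peak = u * math.sin(theta) / g            # time to reach maximum height
T = 2 * u * math.sin(theta) / g             # total time of flight
H = u**2 * math.sin(theta)**2 / (2 * g)     # maximum height
R = u**2 * math.sin(2 * theta) / g          # horizontal range

print(f"time to peak: {t_peak:.2f} s, time of flight: {T:.2f} s")
print(f"max height:   {H:.2f} m, range: {R:.2f} m")
```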

🔗 Airglow

🔗 Physics 🔗 Astronomy

Airglow (also called nightglow) is a faint emission of light by a planetary atmosphere. In the case of Earth's atmosphere, this optical phenomenon causes the night sky never to be completely dark, even after the effects of starlight and diffused sunlight from the far side are removed.

🔗 Why the Z boson had a different mass at different times of day.

🔗 Physics

The Large Electron–Positron Collider (LEP) was one of the largest particle accelerators ever constructed.

It was built at CERN, a multi-national centre for research in nuclear and particle physics near Geneva, Switzerland. LEP collided electrons with positrons at energies that reached 209 GeV. It was a circular collider with a circumference of 27 kilometres built in a tunnel roughly 100 m (330 ft) underground and passing through Switzerland and France. LEP was used from 1989 until 2000. Around 2001 it was dismantled to make way for the Large Hadron Collider, which re-used the LEP tunnel. To date, LEP is the most powerful accelerator of leptons ever built.

🔗 There's Plenty of Room at the Bottom (1959)

🔗 Physics 🔗 Physics/Publications

"There's Plenty of Room at the Bottom: An Invitation to Enter a New Field of Physics" was a lecture given by physicist Richard Feynman at the annual American Physical Society meeting at Caltech on December 29, 1959. Feynman considered the possibility of direct manipulation of individual atoms as a more powerful form of synthetic chemistry than those used at the time. Although versions of the talk were reprinted in a few popular magazines, it went largely unnoticed and did not inspire the conceptual beginnings of the field. Beginning in the 1980s, nanotechnology advocates cited it to establish the scientific credibility of their work.

🔗 Electret

🔗 Physics

An electret (a portmanteau of electr- from "electricity" and -et from "magnet") is a dielectric material that has a quasi-permanent electric charge or dipole polarisation. An electret generates internal and external electric fields, and is the electrostatic equivalent of a permanent magnet. Although Oliver Heaviside coined this term in 1885, materials with electret properties were already known to science and had been studied since the early 1700s. One particular example is the electrophorus, a device consisting of a slab with electret properties and a separate metal plate. The electrophorus was originally invented by Johan Carl Wilcke in Sweden and again by Alessandro Volta in Italy.

The name derives from "electron" and "magnet", drawing an analogy to the formation of a magnet by the alignment of magnetic domains in a piece of iron. Historically, electrets were made by first melting a suitable dielectric material such as a polymer or wax that contains polar molecules, and then allowing it to re-solidify in a powerful electrostatic field. The polar molecules of the dielectric align themselves to the direction of the electrostatic field, producing a dipole electret with a permanent electrostatic bias. Modern electrets are usually made by embedding excess charges into a highly insulating dielectric, e.g. by means of an electron beam, corona discharge, injection from an electron gun, electric breakdown across a gap, or a dielectric barrier.

🔗 Poincaré Recurrence Theorem

🔗 Physics 🔗 Systems 🔗 Systems/Dynamical systems

In mathematics and physics, the Poincaré recurrence theorem states that certain dynamical systems will, after a sufficiently long but finite time, return to a state arbitrarily close to (for continuous state systems), or exactly the same as (for discrete state systems), their initial state.

The Poincaré recurrence time is the length of time elapsed until the recurrence. This time may vary greatly depending on the exact initial state and required degree of closeness. The result applies to isolated mechanical systems subject to some constraints, e.g., all particles must be bound to a finite volume. The theorem is commonly discussed in the context of ergodic theory, dynamical systems and statistical mechanics. Systems to which the Poincaré recurrence theorem applies are called conservative systems.
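For the discrete-state case, recurrence can be seen directly in a toy example. The sketch below is an illustrative choice, not from the article: it iterates Arnold's cat map, a volume-preserving bijection on a finite grid, and counts the steps until a point returns exactly to its initial state. Because the map is a bijection of a finite set, the return is guaranteed to happen in finite time, and the recurrence time depends strongly on the starting point and grid size, echoing the remark above.

```python
# Arnold's cat map on an N x N integer grid: (x, y) -> ((2x + y) mod N, (x + y) mod N).
# The matrix [[2, 1], [1, 1]] has determinant 1, so the map is a bijection of the
# finite grid and every point must eventually return to its initial state
# (the discrete form of Poincaré recurrence).

def cat_map(x, y, n):
    return (2 * x + y) % n, (x + y) % n

def recurrence_time(x0, y0, n):
    """Number of iterations until (x0, y0) recurs exactly."""
    x, y = cat_map(x0, y0, n)
    steps = 1
    while (x, y) != (x0, y0):
        x, y = cat_map(x, y, n)
        steps += 1
    return steps

# Recurrence times for one starting point on grids of different sizes.
for n in (5, 11, 101):
    print(f"N = {n:3d}: returns after {recurrence_time(1, 0, n)} steps")
```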

The theorem is named after Henri Poincaré, who discussed it in 1890; it was proved by Constantin Carathéodory using measure theory in 1919.

🔗 Gravity Probe B

🔗 Spaceflight 🔗 Physics 🔗 Physics/relativity

Gravity Probe B (GP-B) was a satellite-based experiment to test two unverified predictions of general relativity: the geodetic effect and frame-dragging. This was to be accomplished by measuring, very precisely, tiny changes in the direction of spin of four gyroscopes contained in an Earth-orbiting satellite at 650 km (400 mi) altitude, crossing directly over the poles.

The satellite was launched on 20 April 2004 on a Delta II rocket. The spaceflight phase lasted until ; its aim was to measure spacetime curvature near Earth, and thereby the stress–energy tensor (which is related to the distribution and the motion of matter in space) in and near Earth. This provided a test of general relativity, gravitomagnetism and related models. The principal investigator was Francis Everitt.

Initial results confirmed the expected geodetic effect to an accuracy of about 1%. The expected frame-dragging effect was similar in magnitude to the current noise level (the noise being dominated by initially unmodeled effects due to nonuniform coatings on the gyroscopes). Work continued to model and account for these sources of error, thus permitting extraction of the frame-dragging signal. By , the frame-dragging effect had been confirmed to within 15% of the expected result, and the NASA report indicated that the geodetic effect was confirmed to better than 0.5%.

In an article published in the journal Physical Review Letters in , the authors reported that analysis of the data from all four gyroscopes results in a geodetic drift rate of −6601.8±18.3 mas/yr and a frame-dragging drift rate of −37.2±7.2 mas/yr, in good agreement with the general relativity predictions of −6606.1±0.28% mas/yr and −39.2±0.19% mas/yr, respectively.
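As a rough check on "good agreement", one can compare each measured drift rate with its prediction in units of the combined uncertainty. The sketch below treats the percentage figures quoted for the predictions as relative 1σ uncertainties, which is an interpretation rather than something stated explicitly in the excerpt.

```python
import math

def agreement_sigma(measured, meas_err, predicted, pred_rel_err_pct):
    """Difference between measurement and prediction in combined standard deviations."""
    pred_err = abs(predicted) * pred_rel_err_pct / 100.0
    return abs(measured - predicted) / math.hypot(meas_err, pred_err)

# Geodetic drift rate (mas/yr): measured -6601.8 +/- 18.3 vs predicted -6606.1 +/- 0.28%
print(round(agreement_sigma(-6601.8, 18.3, -6606.1, 0.28), 2), "sigma")   # about 0.17

# Frame-dragging drift rate (mas/yr): measured -37.2 +/- 7.2 vs predicted -39.2 +/- 0.19%
print(round(agreement_sigma(-37.2, 7.2, -39.2, 0.19), 2), "sigma")        # about 0.28
```

Under that reading, both differences are well under one combined standard deviation.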

🔗 Binaural beats

🔗 Physics 🔗 Physics/Acoustics

In acoustics, a beat is an interference pattern between two sounds of slightly different frequencies, perceived as a periodic variation in volume whose rate is the difference of the two frequencies.

With tuning instruments that can produce sustained tones, beats can be readily recognized. Tuning two tones to a unison will present a peculiar effect: when the two tones are close in pitch but not identical, the difference in frequency generates the beating. The volume varies, as in a tremolo, as the sounds alternately interfere constructively and destructively. As the two tones gradually approach unison, the beating slows down and may become so slow as to be imperceptible. As the two tones get further apart, their beat frequency starts to approach the range of human pitch perception, the beating starts to sound like a note, and a combination tone is produced. This combination tone can also be referred to as a missing fundamental, as the beat frequency of any two tones is equivalent to their implied fundamental frequency.
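The difference-frequency envelope is easy to verify numerically: the sum of two equal-amplitude sine tones equals a tone at the mean frequency multiplied by a slow envelope whose magnitude rises and falls (f2 − f1) times per second. The frequencies below are arbitrary example values, not taken from the article.

```python
import numpy as np

# Two equal-amplitude tones a few hertz apart (assumed example values).
f1, f2 = 440.0, 443.0          # Hz; the beat frequency is f2 - f1 = 3 Hz
rate = 44100                   # samples per second
t = np.arange(0, 2.0, 1.0 / rate)

combined = np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t)

# Trigonometric identity behind the beating: the sum is a tone at the mean
# frequency, (f1 + f2)/2, multiplied by the slow envelope 2*cos(pi*(f2 - f1)*t).
product_form = 2 * np.cos(np.pi * (f2 - f1) * t) * np.sin(np.pi * (f1 + f2) * t)
print("max deviation from the product form:", np.max(np.abs(combined - product_form)))

# The loudness dips to zero whenever the two tones cancel; counting those dips
# per second recovers the beat frequency.
envelope = np.abs(2 * np.cos(np.pi * (f2 - f1) * t))
dips = np.sum((envelope[1:-1] < envelope[:-2]) & (envelope[1:-1] <= envelope[2:]))
print("beats per second:", dips / 2.0)   # about 3
```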
