Friday, November 30, 2012

Taipei 101

Taipei 101
Taipei 101 (Chinese: 台北101 / 臺北101), formerly known as the Taipei World Financial Center, is a landmark skyscraper located in Xinyi District, Taipei, Taiwan. The building was officially ranked as the world's tallest from 2004 until the opening of the Burj Khalifa in Dubai in 2010. In July 2011, the building was awarded LEED Platinum certification, the highest award in the Leadership in Energy and Environmental Design (LEED) rating system, and became the tallest and largest green building in the world. Taipei 101 was designed by C.Y. Lee & Partners and constructed primarily by KTRT Joint Venture. The tower has served as an icon of modern Taiwan ever since its opening, and received the 2004 Emporis Skyscraper Award. Fireworks launched from Taipei 101 feature prominently in international New Year's Eve broadcasts, and the structure appears frequently in travel literature and international media.
Taipei 101 comprises 101 floors above ground and 5 floors underground. The building was architecturally created as a symbol of the evolution of technology and Asian tradition. Its postmodernist approach to style incorporates traditional design elements and gives them modern treatments. The tower is designed to withstand typhoons and earthquakes. A multi-level shopping mall adjoining the tower houses hundreds of fashionable stores, restaurants and clubs.
Taipei 101 is owned by the Taipei Financial Center Corporation (TFCC) and managed by the international division of Urban Retail Properties Corporation, based in Chicago. The name originally planned for the building, Taipei World Financial Center, was used until 2003 and was derived from the name of the owner. The original name in Chinese was, literally, Taipei International Financial Center (Chinese: 臺北國際金融中心).
Taipei 101
台北101

Taipei 101

Record height
Tallest in the world from 2004 to 2010
Preceded by Petronas Towers
Surpassed by Burj Khalifa
General information
Type Mixed use: communication, conference, fitness center, library, observation, office, restaurant, retail
Location Xinyi District, Taipei, Republic of China
Coordinates 25°2′1″N 121°33′54″E
Construction started 1999
Completed 2004
Opening December 31, 2004
Cost NT$ 58 billion
(US$ 1.80 billion)
Height
Architectural 509 m (1,669.9 ft)
Roof 449.2 m (1,473.8 ft)
Top floor 439 m (1,440.3 ft)
Observatory 391.8 m (1,285.4 ft)
Technical details
Floor count 101 (+5 basement floors)
Floor area 193,400 m2 (2,081,700 sq ft)
Elevators 61 Toshiba/KONE elevators (including double-deck shuttles and 2 high-speed observatory elevators)
Design and construction
Owner Taipei Financial Center Corporation
Management Urban Retail Properties Co.
Architect C.Y. Lee & partners
Structural engineer Thornton Tomasetti
Main contractor KTRT Joint Venture
Website
taipei-101.com.tw

The Biography Of John Smeaton (Father Of Civil Engineering)

John Smeaton (Father Of Civil Engineering)
John Smeaton, FRS, (8 June 1724 – 28 October 1792) was an English civil engineer responsible for the design of bridges, canals, harbours and lighthouses. He was also a capable mechanical engineer and an eminent physicist. Smeaton was the first self-proclaimed civil engineer and is often regarded as the "father of civil engineering".
He was associated with the Lunar Society.
John Smeaton

Portrait of John Smeaton, with the Eddystone Lighthouse in the background
Born 8 June 1724
Austhorpe, Leeds, England
Died 28 October 1792 (aged 68)
Austhorpe, Leeds, England
Nationality British
Occupation Civil engineer

Law and physics

Smeaton was born in Austhorpe, Leeds, England. After studying at Leeds Grammar School he joined his father's law firm, but left to become a mathematical instrument maker (working with Henry Hindley), developing, among other instruments, a pyrometer to study material expansion and a whirling speculum or horizontal top (a maritime navigation aid).
He was elected a Fellow of the Royal Society in 1753, and in 1759 won the Copley Medal for his research into the mechanics of waterwheels and windmills. His 1759 paper "An Experimental Enquiry Concerning the Natural Powers of Water and Wind to Turn Mills and Other Machines Depending on Circular Motion" addressed the relationship between pressure and velocity for objects moving in air (Smeaton noted that the table doing so was actually contributed by "my friend Mr Rouse" "an ingenious gentleman of Harborough, Leicestershire" and calculated on the basis of Rouse's experiments), and his concepts were subsequently developed to devise the 'Smeaton Coefficient'.


Over the period 1759–1782 he performed a series of further experiments and measurements on waterwheels that led him to support and champion the vis viva theory of the German mathematician Gottfried Leibniz, an early formulation of conservation of energy. This led him into conflict with members of the academic establishment, who rejected Leibniz's theory as inconsistent with Sir Isaac Newton's conservation of momentum.

Civil engineering

Recommended by the Royal Society, Smeaton designed the third Eddystone Lighthouse (1755–59). He pioneered the use of 'hydraulic lime' (a form of mortar which will set under water) and developed a technique involving dovetailed blocks of granite in the building of the lighthouse. His lighthouse remained in use until 1877, when the rock underlying the structure's foundations had begun to erode; it was dismantled and partially rebuilt at Plymouth Hoe, where it is known as Smeaton's Tower. He played an important role in the history, rediscovery, and development of modern cement, because he identified the compositional requirements needed to obtain "hydraulicity" in lime; this work led ultimately to the invention of Portland cement.
Deciding that he wanted to focus on the lucrative field of civil engineering, he commenced an extensive series of commissions, including bridges, canals, harbours and lighthouses.
Because of his expertise in engineering, Smeaton was called to testify in court for a case related to the silting-up of the harbour at Wells-next-the-Sea in Norfolk in 1782: he is considered to be the first expert witness to appear in an English court. He also acted as a consultant on the disastrous 63-year-long New Harbour at Rye, designed to combat the silting of the port of Winchelsea. The project is now known informally as "Smeaton's Harbour", but despite the name his involvement was limited and occurred more than 30 years after work on the harbour commenced.

Difficulties Of Interstellar Space Travel

The difficulties of interstellar space travel

The main challenge facing interstellar travel is the vast distances that have to be covered. This means that a very great speed and/or a very long travel time is needed. The travel time with most realistic propulsion methods would range from decades to millennia. Hence an interstellar ship would be much more severely exposed to the hazards found in interplanetary travel, including vacuum, radiation, weightlessness, and micrometeoroids. The long travel times make it difficult to design manned missions. The fundamental limits of space-time present another challenge. Furthermore, it is difficult to foresee interstellar trips being justified for conventional economic reasons.

Required energy

A significant factor contributing to the difficulty is the energy which must be supplied to obtain a reasonable travel time. A lower bound for the required energy is the kinetic energy K = ½mv², where m is the final mass and v the travel speed. If deceleration on arrival is desired and cannot be achieved by any means other than the engines of the ship, then the required energy at least doubles, because the energy needed to halt the ship equals the energy needed to accelerate it to travel speed.
The velocity needed for a manned round trip of a few decades to even the nearest star is thousands of times greater than that of present space vehicles. Because kinetic energy scales with the square of velocity, this means millions of times as much energy is required. Accelerating one ton to one-tenth of the speed of light requires at least 450 PJ (4.5 × 10¹⁷ J, or 125 billion kWh), not accounting for losses. This energy has to be carried along, as solar panels do not work far from the Sun and other stars.
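The 450 PJ figure above follows directly from the kinetic-energy lower bound; a short sketch, using only the numbers stated in the text:

```python
# Kinetic energy needed to accelerate a one-ton probe to one-tenth of the
# speed of light, using the classical lower bound K = 1/2 m v^2.
# (At 0.1c the relativistic correction is under one percent, so the
# classical formula is a fair lower bound, as the text implies.)

C = 299_792_458.0        # speed of light, m/s
mass_kg = 1_000.0        # one metric ton
v = 0.1 * C              # cruise speed: one-tenth of c

kinetic_energy_j = 0.5 * mass_kg * v ** 2
print(f"Energy to accelerate: {kinetic_energy_j:.2e} J")   # ~4.5e17 J (450 PJ)
print(f"Equivalent:           {kinetic_energy_j / 3.6e6:.2e} kWh")  # ~1.25e11 kWh

# Braking at the destination with the ship's own engines doubles the total:
print(f"With deceleration:    {2 * kinetic_energy_j:.2e} J")
```

Running this reproduces the 450 PJ and 125 billion kWh values quoted in the paragraph above.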
There is some belief that the magnitude of this energy may make interstellar travel impossible. It has been reported that at the 2008 Joint Propulsion Conference, where future space propulsion challenges were discussed and debated, a conclusion was reached that it was improbable that humans would ever explore beyond the Solar System. Brice N. Cassenti, an associate professor with the Department of Engineering and Science at Rensselaer Polytechnic Institute, stated, "At least 100 times the total energy output of the entire world [in a given year] would be required for the voyage (to Alpha Centauri)."

Interstellar medium

A major issue with traveling at extremely high speeds is that interstellar dust and gas may cause considerable damage to the craft, due to the high relative speeds and large kinetic energies involved. Various shielding methods to mitigate this problem have been proposed. Larger objects (such as macroscopic dust grains) are far less common, but would be much more destructive. The risks of impacting such objects, and methods of mitigating these risks, have not been adequately assessed.

Travel time

It can be argued that an interstellar mission which cannot be completed within 50 years should not be started at all. Instead, assuming that a civilization is still on an increasing curve of propulsion system velocity and has not yet reached the limit, the resources should be invested in designing a better propulsion system. This is because a slow spacecraft would probably be passed by another mission sent later with more advanced propulsion. On the other hand, Andrew Kennedy has shown that if one calculates the journey time to a given destination while achievable travel speed keeps increasing through growth (even exponential growth), there is a clear minimum in the total time to that destination from now. Voyages undertaken before the minimum will be overtaken by those who leave at the minimum, while those who leave after the minimum will never overtake those who left at the minimum.
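The existence of such a minimum is easy to see numerically. The following is an illustration only, not Kennedy's exact model; the starting speed (0.001c), growth rate (2% per year) and 4.3 light-year distance are assumed numbers chosen to make the effect visible:

```python
# "Wait calculation" sketch: if achievable cruise speed grows exponentially
# with time, total time to arrival (years waited before launch plus transit
# time at the speed available at launch) has a clear minimum.

DIST_LY = 4.3     # distance to destination, light-years (Alpha Centauri scale)
V0 = 0.001        # today's achievable speed as a fraction of c (assumption)
GROWTH = 0.02     # fractional speed growth per year (assumption)

def total_time(wait_years: float) -> float:
    """Years from now until arrival if launch is delayed by wait_years."""
    speed = V0 * (1 + GROWTH) ** wait_years   # speed available at launch
    return wait_years + DIST_LY / speed       # waiting time + transit time

best = min(range(500), key=total_time)
print(f"Launch now:      arrive in {total_time(0):,.0f} years")
print(f"Optimal wait:    {best} years, arrive in {total_time(best):,.0f} years")
```

With these assumed numbers, launching immediately means a 4,300-year transit, while waiting roughly two centuries for faster propulsion gets a ship to the destination sooner in absolute terms; ships launched before the optimum are overtaken en route.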
One argument against delaying a start until fast propulsion velocities are reached is that the various other non-technical problems specific to long-distance travel at considerably higher speed (such as interstellar particle impact, or a possible dramatic shortening of average human life span during extended space residence) may remain obstacles that take far longer to resolve than the propulsion issue alone, assuming they can be solved at all. A case can therefore be made for starting a mission without delay, based on the concept of an achievable and dedicated but relatively slow interstellar mission using the current technological state of the art and at relatively low cost, rather than banking on being able to solve all the problems associated with a faster mission without a reliable time frame for achieving such solutions.
Intergalactic travel involves distances about a million-fold greater than interstellar distances, making it radically more difficult than even interstellar travel.

Interstellar distances

Astronomical distances are often measured in the time it would take a beam of light to travel between two points. Light in a vacuum travels approximately 300,000 kilometers per second or 186,000 miles per second.
The distance from Earth to the Moon is 1.3 light-seconds. With current spacecraft propulsion technologies, a craft can cover the distance from the Earth to the Moon in around eight hours (New Horizons). That means light travels approximately thirty thousand times faster than current spacecraft propulsion technologies. The distance from Earth to other planets in the solar system ranges from three light-minutes to about four light-hours. Depending on the planet and its alignment to Earth, for a typical unmanned spacecraft these trips will take from a few months to a little over a decade.
The nearest known star to the Sun is Proxima Centauri, which is 4.23 light-years away. However, there may be undiscovered brown dwarf systems that are closer. The fastest outward-bound spacecraft yet sent, Voyager 1, has covered 1/600th of a light-year in 30 years and is currently moving at 1/18,000th the speed of light. At this rate, a journey to Proxima Centauri would take 72,000 years. Of course, this mission was not specifically intended to travel fast to the stars, and current technology could do much better. The travel time could be reduced to a few millennia using lightsails, or to a century or less using nuclear pulse propulsion. A better understanding of the vastness of the interstellar distance to one of the closest stars to the sun, Alpha Centauri A (a Sun-like star), can be obtained by scaling down the Earth-Sun distance (~150,000,000 km) to one meter. On this scale the distance to Alpha Centauri A would still be 271 kilometers or about 169 miles.
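The travel-time figures above can be checked with simple arithmetic. The sketch below uses the round figures stated in the text (a speed of 1/18,000 of light speed and a 4.23 light-year distance); the text's 72,000-year figure comes from slightly different speed and distance estimates:

```python
# Rough check of Voyager 1's travel time to Proxima Centauri, using the
# round figures from the text: speed of 1/18,000 c, distance 4.23 ly.

C_KM_S = 299_792.458
voyager_speed_km_s = C_KM_S / 18_000            # ~16.7 km/s
fraction_of_c = voyager_speed_km_s / C_KM_S     # = 1/18,000

# Time in years = distance in light-years / speed as a fraction of c.
years_to_proxima = 4.23 / fraction_of_c
print(f"Travel time at Voyager 1's speed: {years_to_proxima:,.0f} years")
```

This yields roughly 76,000 years, the same order as the text's estimate; either way, the distance is hopeless for chemical-rocket-era probes.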
However, more speculative approaches to interstellar travel offer the possibility of circumventing these difficulties. Special relativity offers the possibility of shortening the travel time: if a starship with sufficiently advanced engines could reach velocities approaching the speed of light, relativistic time dilation would make the voyage much shorter for the traveler. However, it would still take many years of elapsed time as viewed by the people remaining on Earth, and upon returning to Earth, the travelers would find that far more time had elapsed on Earth than had for them. (For more on this effect, see twin paradox.)
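The time-dilation effect described above is easy to quantify for an idealized trip at constant cruise speed (ignoring acceleration and deceleration phases, purely as an illustration):

```python
# Relativistic time dilation for an idealized one-way trip at constant
# speed: Earth observers see the trip take distance/speed, while the
# traveler experiences that time divided by the Lorentz factor gamma.

import math

def trip_times(distance_ly: float, speed_fraction_c: float):
    """Return (Earth-frame years, traveler years) for a one-way trip."""
    earth_years = distance_ly / speed_fraction_c
    gamma = 1.0 / math.sqrt(1.0 - speed_fraction_c ** 2)
    traveler_years = earth_years / gamma
    return earth_years, traveler_years

# 4.23 ly is the distance to Proxima Centauri quoted above.
for v in (0.5, 0.9, 0.99):
    earth, traveler = trip_times(4.23, v)
    print(f"v = {v:.2f}c: Earth sees {earth:.1f} yr, traveler ages {traveler:.1f} yr")
```

At 0.99c the traveler ages well under a year while more than four years pass on Earth, which is the asymmetry the twin paradox refers to.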
General relativity offers the theoretical possibility that faster-than-light travel may be possible without violating fundamental laws of physics, for example, through wormholes, although it is still debated whether this is possible, in part, because of causality concerns. Proposed mechanisms for faster-than-light travel within the theory of General Relativity require the existence of exotic matter.

Communications

The round-trip delay time is the minimum time between an observation by the probe and the moment the probe can receive instructions from Earth reacting to the observation. Given that information can travel no faster than the speed of light, this is about 32 hours for Voyager 1, and near Proxima Centauri it would be 8 years. Faster reactions would have to be programmed to be carried out automatically. Of course, in the case of a manned flight the crew can respond immediately to their observations. However, the round-trip delay time makes them not only extremely distant from, but in terms of communication also extremely isolated from, Earth (analogous to how past long-distance explorers were similarly isolated before the invention of the electrical telegraph).
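Both delay figures follow from the light-travel time alone. A minimal sketch, assuming Voyager 1 was roughly 120 AU from Earth (its approximate distance around 2012; the text's 32-hour figure corresponds to a slightly smaller distance):

```python
# Round-trip signal delay at light speed for the two cases in the text:
# Voyager 1 (~120 AU from Earth, an assumed round figure) and Proxima
# Centauri (4.23 light-years).

AU_KM = 149_597_870.7      # kilometers per astronomical unit
C_KM_S = 299_792.458       # speed of light, km/s
LY_AU = 63_241.1           # astronomical units per light-year

def round_trip_hours(distance_au: float) -> float:
    """Hours for a signal to travel to the probe and back."""
    one_way_seconds = distance_au * AU_KM / C_KM_S
    return 2 * one_way_seconds / 3600

print(f"Voyager 1 (~120 AU):  {round_trip_hours(120):.0f} hours")
proxima_years = round_trip_hours(4.23 * LY_AU) / (24 * 365.25)
print(f"Proxima Centauri:     {proxima_years:.1f} years")
```

The Proxima round trip is simply twice the 4.23-year one-way light time, which is why no Earth-based control loop is workable at interstellar range.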
Interstellar communication remains problematic: even if a probe could reach the nearest star, its ability to communicate back to Earth would be limited by the extreme distance.

Prime targets for interstellar travel

There are 59 known stellar systems within 20 light years from the Sun, containing 81 visible stars. The following could be considered prime targets for interstellar missions:
  • Alpha Centauri (4.3 ly): Closest system. Three stars (G2, K1, M5). Component A is similar to our Sun (a G2 star). Alpha Centauri B has one confirmed planet.
  • Barnard's Star (6.0 ly): Small, low-luminosity M5 red dwarf. Next closest to the Solar System.
  • Sirius (8.7 ly): Large, very bright A1 star with a white dwarf companion.
  • Epsilon Eridani (10.8 ly): Single K2 star slightly smaller and colder than the Sun. Has two asteroid belts, might have a giant planet and one much smaller planet, and may possess a solar-system-type planetary system.
  • Tau Ceti (11.8 ly): Single G8 star similar to the Sun. High probability of possessing a solar-system-type planetary system.
  • Gliese 581 (20.3 ly): Multiple-planet system. The unconfirmed exoplanet Gliese 581 g and the confirmed exoplanet Gliese 581 d are in the star's habitable zone.
Existing and near-term astronomical technology is capable of finding planetary systems around these objects, increasing their potential for exploration.

Manned missions



The mass of any craft capable of carrying humans would inevitably be substantially larger than that necessary for an unmanned interstellar probe. For instance, the first artificial satellite, Sputnik 1, had a payload of 83.6 kg, while Sputnik 2, the spacecraft that carried the first living passenger (Laika the dog), had a payload six times greater at 508.3 kg. This underestimates the difference in the case of interstellar missions, given the vastly greater travel times involved and the resulting necessity of a closed-cycle life support system. Because of this, combined with the aggregate risks and support requirements of manned interstellar travel, the first interstellar missions are unlikely to carry earthly life forms even as technology continues to advance.






Interstellar Space Travel

Interstellar Space Travel
Interstellar space travel is manned or unmanned travel between stars. The concept of interstellar travel in starships is a staple of science fiction. Interstellar travel is conceptually much more difficult than interplanetary travel. Intergalactic travel, or travel between different galaxies, would be even more difficult.
Many scientific papers have been published about related concepts. Given sufficient travel time and engineering work, both unmanned and sleeper ship interstellar travel seem possible, though both present considerable technological and economic challenges that are unlikely to be met in the near future, particularly for manned probes. NASA, ESA and other space agencies have been engaging in research into these topics for several years, and have accumulated a number of theoretical approaches.
Energy requirements appear to make interstellar travel impractical for generation ships, but less so for heavily shielded sleeper ships.

Carbon Fibre Reinforced Polymer (CFRP)

Carbon Fibre Reinforced Polymer
Carbon-fiber-reinforced polymer or carbon-fiber-reinforced plastic (CFRP or CRP, or often simply carbon fiber) is an extremely strong and light fiber-reinforced polymer which contains carbon fibers. The polymer is most often epoxy, but other polymers, such as polyester, vinyl ester or nylon, are sometimes used. The composite may contain other fibers, such as Kevlar, aluminium, or glass fibers, as well as carbon fiber. The strongest and most expensive of these additives, carbon nanotubes, are contained in some primarily-polymer baseball bats, car parts and even golf clubs, where economically viable.
Although carbon fiber can be relatively expensive, it has many applications in the aerospace and automotive fields, such as Formula One. The compound is also used in sailboats, modern bicycles, and motorcycles, where its high strength-to-weight ratio and very good rigidity are important. Improved manufacturing techniques are reducing the costs and time to manufacture, making it increasingly common in small consumer goods as well, such as certain ThinkPads since the 600 series, tripods, fishing rods, hockey sticks, paintball equipment, archery equipment, tent poles, racquet frames, stringed instrument bodies, drum shells, golf clubs, helmets used as a paragliding accessory, and pool/billiards/snooker cues.
The material is also referred to as graphite-reinforced polymer or graphite fiber-reinforced polymer (GFRP is less common, as it clashes with glass-(fiber)-reinforced polymer). In product advertisements, it is sometimes referred to simply as graphite fiber for short.

Tuesday, November 27, 2012

Nano Technology (Nano Tech)

Nano Technology
Nanotechnology (sometimes shortened to "nanotech") is the manipulation of matter on an atomic and molecular scale. Generally, nanotechnology works with materials, devices, and other structures with at least one dimension sized from 1 to 100 nanometres. Quantum mechanical effects are important at this quantum-realm scale. With a variety of potential applications, nanotechnology is a key technology for the future, and governments have invested billions of dollars in its research. Through its National Nanotechnology Initiative, the USA has invested 3.7 billion dollars. The European Union has invested 1.2 billion dollars and Japan 750 million dollars.
Nanotechnology is very diverse, ranging from extensions of conventional device physics to completely new approaches based upon molecular self-assembly, from developing new materials with dimensions on the nanoscale to direct control of matter on the atomic scale. Nanotechnology entails the application of fields of science as diverse as surface science, organic chemistry, molecular biology, semiconductor physics, microfabrication, etc.
Scientists currently debate the future implications of nanotechnology. Nanotechnology may be able to create many new materials and devices with a vast range of applications, such as in medicine, electronics, biomaterials and energy production. On the other hand, nanotechnology raises many of the same issues as any new technology, including concerns about the toxicity and environmental impact of nanomaterials, and their potential effects on global economics, as well as speculation about various doomsday scenarios. These concerns have led to a debate among advocacy groups and governments on whether special regulation of nanotechnology is warranted.

Design Of Lockheed Martin F-35 Lightning II

Design
The F-35 appears to be a smaller, slightly more conventional, single-engine sibling of the sleeker, twin-engine Lockheed Martin F-22 Raptor, and indeed drew elements from it. The exhaust duct design was inspired by the General Dynamics Model 200, which was proposed for a 1972 supersonic VTOL fighter requirement for the Sea Control Ship. For specialized development of the F-35B STOVL variant, Lockheed consulted with the Yakovlev Design Bureau, purchasing design data from their development of the Yakovlev Yak-141 "Freestyle". Although several experimental designs have been built and tested since the 1960s, including the Navy's unsuccessful Rockwell XFV-12, the F-35B is to be the first operational supersonic STOVL stealth fighter.
The F-35 has a maximum speed of over Mach 1.6. With a maximum takeoff weight of 60,000 lb (27,000 kg), the Lightning II is considerably heavier than the lightweight fighters it replaces. In empty and maximum gross weights, it more closely resembles the single-seat, single-engine Republic F-105 Thunderchief, which was the largest single-engine fighter of the Vietnam War era. The F-35's modern engine delivers over 60 percent more thrust in an aircraft of the same weight, so that in thrust-to-weight ratio and wing loading it is much closer to a comparably equipped F-16.
Acquisition deputy to the assistant secretary of the air force, Lt. Gen. Mark D. "Shack" Shackelford has said that the F-35 is designed to be America's "premier surface-to-air missile killer and is uniquely equipped for this mission with cutting edge processing power, synthetic aperture radar integration techniques, and advanced target recognition."

Some improvements over current-generation fighter aircraft are:
  • Durable, low-maintenance stealth technology, using structural fiber mat instead of the high-maintenance coatings of legacy stealth platforms;
  • Integrated avionics and sensor fusion that combine information from off- and on-board sensors to increase the pilot's situational awareness and improve target identification and weapon delivery, and to relay information quickly to other command and control (C2) nodes;
  • High-speed data networking, including IEEE 1394b and Fibre Channel. (Fibre Channel is also used on Boeing's Super Hornet.)
  • The Autonomic Logistics Global Sustainment (ALGS), Autonomic Logistics Information System (ALIS) and Computerized Maintenance Management System (CMMS) are claimed to help ensure aircraft uptime with minimal maintenance manpower. However, the Pentagon has moved to open the sustainment for competitive bidding by other companies. This came after Lockheed admitted that instead of costing twenty percent less than the F-16 per flight hour, the F-35 would actually cost twelve percent more. The USMC have implemented a workaround for a cyber vulnerability in the system.
  • Electrohydrostatic actuators run by a power-by-wire flight-control system.
Lockheed Martin claims the F-35 is intended to have close and long-range air-to-air capability second only to that of the F-22 Raptor. The company has suggested that the F-35 could also replace the USAF's F-15C/D fighters in the air superiority role and the F-15E Strike Eagle in the ground attack role, but it does not have the range or payload of either F-15 model. The F-35A does carry a similar air-to-air armament to the conceptual Boeing F-15SE Silent Eagle when both aircraft are configured for low observable operations, and has over 80 percent of the larger aircraft's combat radius under those conditions.
Lockheed Martin has said that the F-35 has the advantage over the F-22 in basing flexibility and "advanced sensors and information fusion".
Structural composites in the F-35 are 35% of the airframe weight (up from 25% in the F-22). The majority of these are bismaleimide (BMI) and composite epoxy material. However, the F-35 will be the first mass-produced aircraft to include structural nanocomposites, namely carbon nanotube reinforced epoxy.
The F-35 program has learned from the corrosion problems that the F-22 had when it was first introduced in 2005. The F-35 uses a gap filler that causes less galvanic corrosion to the skin, is designed with fewer gaps in its skin that require gap filler, and has better drainage.
A United States Navy study found that the F-35 will cost 30 to 40 percent more to maintain than current jet fighters. A Pentagon study found that it may cost $1 trillion to maintain the entire fleet over its lifetime.
The relatively short 35 foot wingspan of the A and B variants is set by the F-35B's requirement to fit inside the Navy's current amphibious assault ship elevators. The F-35C's longer wing is considered to be more fuel efficient.

Engines

The F-35's main engine is the Pratt & Whitney F135. The General Electric/Rolls-Royce F136 was under development as an alternative engine until December 2011, when the manufacturers canceled work on it. The F135/F136 engines are not designed to supercruise in the F-35, but the F-35 can achieve a limited supercruise of Mach 1.2 for 150 miles. The STOVL versions of both power plants use the Rolls-Royce LiftSystem, designed by Lockheed Martin and developed to production by Rolls-Royce. This system is more like the Russian Yak-141 and German VJ 101D/E than the preceding generation of STOVL designs, such as the Harrier Jump Jet, in which all of the lifting air went through the main fan of the Rolls-Royce Pegasus engine.
The LiftSystem is composed of a lift fan, drive shaft, two roll posts and a "Three Bearing Swivel Module" (3BSM). The 3BSM is a thrust-vectoring nozzle which allows the main engine exhaust to be deflected downward at the tail of the aircraft. The lift fan is near the front of the aircraft and provides a counterbalancing thrust using two counter-rotating blisks. It is powered by the engine's low-pressure (LP) turbine via a drive shaft and gearbox. Roll control during slow flight is achieved by diverting unheated engine bypass air through wing-mounted thrust nozzles called roll posts. As with lift engines, the added lift fan machinery increases lift capacity during vertical flight but is dead weight during horizontal flight. The cool exhaust of the fan also reduces the amount of hot, high-velocity air that is projected downward during vertical takeoff, which can damage runways and aircraft carrier decks.
To date, F136 funding has come at the expense of other parts of the program, reducing the number of aircraft built and increasing their costs. The F136 team has claimed that their engine has a greater temperature margin, which may prove critical for VTOL operations in hot, high-altitude conditions.
Pratt & Whitney is also testing higher-thrust versions of the F135, partly in response to GE's claims that the F136 is capable of producing more thrust than the 43,000 lbf (190 kN) supplied by early F135s. The F135 has demonstrated a maximum thrust of over 50,000 lbf (220 kN) during testing. The F-35's Pratt & Whitney F135 is the most powerful engine ever installed in a fighter aircraft.
The F135 is the second (radar) stealthy afterburning jet engine and like the Pratt & Whitney F119 from which it was derived, has suffered from pressure pulsations in the afterburner at low altitude and high speed or "screech". In both cases this problem was fixed during development of the fighter program.
Turbine bearing health in the engine will be monitored with thermoelectric powered wireless sensors.

Lockheed Martin F-35 Lightning II

Lockheed Martin F-35 Lightning II
The Lockheed Martin F-35 Lightning II is a family of single-seat, single-engine, fifth generation multirole fighters under development to perform ground attack, reconnaissance, and air defense missions with stealth capability. The F-35 has three main models; the F-35A is a conventional takeoff and landing variant, the F-35B is a short take off and vertical-landing variant, and the F-35C is a carrier-based variant.
The F-35 is descended from the X-35, the product of the Joint Strike Fighter (JSF) program. JSF development is being principally funded by the United States, with the United Kingdom and other partner governments providing additional funding. The partner nations are either NATO members or close U.S. allies. It is being designed and built by an aerospace industry team led by Lockheed Martin. The F-35 carried out its first flight on 15 December 2006.
The United States plans to buy a total of 2,443 aircraft to provide the bulk of its tactical airpower for the U.S. Air Force, Marine Corps and Navy over the coming decades. The United Kingdom, Italy, Netherlands, Australia, Canada, Norway, Denmark, Turkey, Israel and Japan are part of the development program and may equip their air services with the F-35.
Reference: Wikipedia.org

Technology -The Term

Technology
Technology is the making, modification, usage, and knowledge of tools, machines, techniques, crafts, systems, and methods of organization, in order to solve a problem, improve a preexisting solution to a problem, achieve a goal, handle an applied input/output relation or perform a specific function. It can also refer to the collection of such tools, machinery, modifications, arrangements and procedures. Technologies significantly affect human as well as other animal species' ability to control and adapt to their natural environments. The word technology comes from Greek τεχνολογία (technología); from τέχνη (téchnē), meaning "art, skill, craft", and -λογία (-logía), meaning "study of-".[1] The term can either be applied generally or to specific areas: examples include construction technology, medical technology, and information technology.

The human species' use of technology began with the conversion of natural resources into simple tools. The prehistorical discovery of the ability to control fire increased the available sources of food, and the invention of the wheel helped humans in travelling in and controlling their environment. Recent technological developments, including the printing press, the telephone, and the Internet, have lessened physical barriers to communication and allowed humans to interact freely on a global scale. However, not all technology has been used for peaceful purposes; the development of weapons of ever-increasing destructive power has progressed throughout history, from clubs to nuclear weapons.

Technology has affected society and its surroundings in a number of ways. In many societies, technology has helped develop more advanced economies (including today's global economy) and has allowed the rise of a leisure class. Many technological processes produce unwanted by-products, known as pollution, and deplete natural resources, to the detriment of the Earth and its environment.
Various implementations of technology influence the values of a society and new technology often raises new ethical questions. Examples include the rise of the notion of efficiency in terms of human productivity, a term originally applied only to machines, and the challenge of traditional norms. Philosophical debates have arisen over the present and future use of technology in society, with disagreements over whether technology improves the human condition or worsens it. Neo-Luddism, anarcho-primitivism, and similar movements criticise the pervasiveness of technology in the modern world, opining that it harms the environment and alienates people; proponents of ideologies such as transhumanism and techno-progressivism view continued technological progress as beneficial to society and the human condition. Indeed, until recently, it was believed that the development of technology was restricted only to human beings, but recent scientific studies indicate that other primates and certain dolphin communities have developed simple tools and learned to pass their knowledge to other generations.