Sunday, March 29, 2020

Who invented Bullet Train?


Hideo Shima, 1964



The first successful steam locomotive, George Stephenson's Rocket, could manage 20 km an hour. Over the next 150 years, steady if incremental improvements in engines, tracks and railway operations saw speeds reach five to eight times that of the Rocket.
Only France and Japan were thinking seriously about new designs and systems that would go faster. Japan's Shinkansen, popularly called the Bullet Train because of its shape, began running between Tokyo and Osaka in 1964, in time for the Tokyo Olympics. Seventeen years later, but still well ahead of anyone else, the first French TGV began plying between Paris and Lyon.

History of Bullet Train:

Not everything about these trains was new. The newly built rolling stock was streamlined to cut wind resistance, but was still powered by electricity from overhead wires, and still carried by steel wheels on steel rails over conventional (if often newly laid) track bed. The French and Japanese were simply pushing the existing technology as far as it would go.


The speeds were impressive, averaging over 250 km per hour over a journey of several hours, and reaching over 500 km per hour in trials. Travel was comfortable, departures punctual, environmental consequences minimal and the safety record impressive, especially once the tracks were fenced off. Similar train lines began to appear in other countries, and the lines could compete with air travel over short distances, being cheaper, taking less time and consuming less energy.

Was that enough? Some thought not, and were already experimenting with the next-generation technology, a major leap toward magnetic levitation or 'mag-lev'. No longer would trains run on rails. They would float above them, held up, guided and even driven forward by the interaction of electromagnetic fields between the track and the train. With no on-board motors, the trains would be much lighter and more economical to run. No 'mag-lev' train is yet in service, but the Japanese, active since 1970, have reached nearly 600 km per hour in trials. Even higher speeds, approaching 1000 km per hour, would be possible with the train running through an evacuated tube.

Saturday, March 28, 2020

Who invented Digital Camera?


Steven Sasson, 1975


The history of photography goes back about 200 years. The many generations of cameras all had one thing in common: images were recorded on glass plates or plastic film using chemical emulsions that were sensitive to light. They were made visible and permanent by chemical processing. Other than the briefly successful Polaroid process, no camera took a picture you could look at immediately. Furthermore, the only way you could send a good-quality photo to someone was as a hard copy printed on paper.

The comparison with today's electronic or digital cameras is startling. You can view the image as soon as you have snapped it, and you can send it down a phone line or cable, confident it will arrive looking good. And processing is easy. You can play with images, enlarge them, change the framing or color balance; things that previously took hours in the darkroom.

History of invention:

This world of digital imaging is very new, for the everyday photographer at least. The first affordable electronic cameras reached the market only in the early 1990s. Their origins lay a few decades further back, and derive much from the development of television. That, too, required technology to convert images into electric currents. In most digital cameras, the key component is the CCD (charge-coupled device), based, like the computer chip, on semiconductors such as silicon. A CCD is an array of light-sensitive cells that accumulate electric charge in proportion to the brightness of the light that falls on them, much as solar cells do. The level of charge on the pixels can be read out one by one, and the data stored and processed to recreate the scene.
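As a rough illustration of that read-out idea (a toy model, not any real camera's design, with made-up sensor dimensions), the short Python sketch below treats a CCD as a tiny grid of charge 'buckets' filled in proportion to the light and then read out cell by cell:

```python
import random

# Toy CCD: a tiny grid of light-sensitive cells ("pixels").
WIDTH, HEIGHT = 8, 6      # assumed, deliberately tiny sensor
FULL_WELL = 255           # maximum charge a cell can hold

# Simulated exposure: a brightness value between 0.0 and 1.0 for each cell.
scene = [[random.random() for _ in range(WIDTH)] for _ in range(HEIGHT)]

# Each cell accumulates charge in proportion to the light falling on it.
charges = [[min(FULL_WELL, int(b * FULL_WELL)) for b in row] for row in scene]

# Read-out: the charge is shifted out one cell at a time, row by row,
# and the digitised values together recreate the scene as an image.
for row in charges:
    print(" ".join(f"{value:3d}" for value in row))
```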

The first CCDs, built in the 1970s, were crude, with only 10,000 pixels (modern cameras boast millions of pixels). The images were correspondingly fuzzy. The first attempt to build one of these into a camera, undertaken by Steven Sasson at Eastman Kodak in 1975, was equally clunky. The camera weighed 4 kg and took 20 seconds per image. It was only a technical exercise to see if it could be done, not intended for the market.

But the exercise began an inevitable trend, one that would ultimately kill off the business that Kodak pioneered: cameras using film. Such cameras are not yet dead, but they live on mostly in one niche, as the cheap, disposable cameras of last resort, used in emergencies or extreme environments such as underwater. These cameras trade off cost and disposability against image quality and just about every other feature of both traditional and digital photography. It is a rather sad farewell.

The evolution of the digital camera has been swift, in line with the rest of information technology. We see the same trends: ever-increasing capabilities at ever-decreasing cost. Early digital cameras cost $10,000 or more; that price has shrunk at least 20-fold. Progress has been aided by the development of international standards for storing images in digital code, such as the JPEG format devised by the Joint Photographic Experts Group. Among other things, this allows images to be 'compressed': a great deal of information in an image can be discarded without unacceptable loss of quality, as with DVDs and MP3 players. As a result, images take up much less space in computer memories and can be quickly transmitted by email or over the Internet.
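A minimal sketch of that quality-for-size trade-off, assuming the Pillow imaging library is installed and an input picture named photo.png exists (both the filename and the quality settings are just for illustration):

```python
import os
from PIL import Image

img = Image.open("photo.png").convert("RGB")   # hypothetical input picture

# Save the same image at two JPEG quality settings and compare file sizes.
# Lower quality discards more information, giving a much smaller file.
for quality in (95, 30):
    out_name = f"photo_q{quality}.jpg"
    img.save(out_name, "JPEG", quality=quality)
    print(out_name, os.path.getsize(out_name), "bytes")
```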

New sorts of data storage, such as 'flash memory', let a camera store more pictures; inkjet printers generate quality copies on demand. Cameras continue to get smaller, now often barely larger than a credit card, able to be secreted in a mobile phone or in hard-to-reach places. Similar technology has transformed video cameras. Most electronic cameras will now take both still and moving pictures. We have come a very long way in little more than a decade.

The digital camera transformation raises a host of issues. Simplicity and ubiquity (such as in mobile phones) have encouraged some inappropriate use, with invasions of privacy that can verge on assault. The capacity to manipulate images by digitally altering the location and content of individual pixels has damaged our confidence that 'the camera never lies'. Surveillance in public places is easier and cheaper and therefore more common. This no doubt helps with the maintenance of law and order but raises questions about our right to privacy once we step out of our front doors. Digital cameras are one of the ways we can create and access information freely for ourselves, a capacity central to the information age, open to both use and abuse.

Friday, March 27, 2020

Who invented Electricity?


Michael Faraday, Joseph Henry, 1831



Many see Michael Faraday, the bookbinder’s apprentice (and protégé of Humphry Davy) who rose to head London’s Royal Institution, as the greatest researcher of the 19th century, or perhaps ever. His reputation rests mostly on his prodigious studies of electricity and magnetism over a period of 20 years.

History of Electricity:

He knew of the link the Dane Hans Oersted had found between electricity and magnetism: an electric current in a wire is surrounded by a magnetic field. In 1831 he asked: if electricity can produce magnetism, can magnetism produce electricity? Anyone can do what he did. He wrapped a wire a number of times around one side of an iron ring, and connected the ends to a battery and switch. He wrapped a second coil around the other side of the ring and connected it to an instrument (a galvanometer) to report any current that flowed. He hoped to show that a current flowing in one coil would make a current flow in the other, the connection coming via the magnetic field.

Almost nothing happened. When Faraday turned on the current in the first coil, the needle in the galvanometer flicked quickly one way before settling back to zero. When he turned the current off again, the needle flicked the other way and returned to zero. So the magnetic field had to be changing, growing stronger or weaker, before it would make a current flow.

Faraday had found 'electromagnetic induction'. A changing magnetic field 'induces' a current in a nearby wire. The practical consequences were obvious. A machine with magnets and coils of wire constantly moving relative to each other would produce a sustained electric current, as an alternative to using a battery. Faraday's first 'dynamo' spun a copper plate between the poles of a powerful magnet. A current flowed in the disc, but energy wastage was high. Still, the crucial principle had been demonstrated.
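In modern notation (not Faraday's own), the effect is usually summarised by Faraday's law of induction, which says that the voltage induced in a coil of $N$ turns depends on how quickly the magnetic flux $\Phi$ through it changes:

```latex
\mathcal{E} = -N\,\frac{d\Phi}{dt}
```

A steady field induces nothing, which is exactly what the flicking galvanometer needle showed.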

A year later French inventor Hippolyte Pixii replaced the disc with some coils of wire, holding them still and spinning the magnet. This began decades of research to find the best arrangement of coils and magnets, including electromagnets, to convert movement into electricity. Some designs produced a current in only one direction (direct current or DC); other designs produced alternating current (AC), which flowed back and forth.

Similar findings were being made across the Atlantic. Natural philosopher Joseph Henry was a major figure in the growth of American science in the 19th century, with achievements as both a researcher and a statesman of science. In 1846 he left a post at what was later called Princeton University to become the first Secretary of the newly founded Smithsonian Institution in Washington, today one of the world's great museums and laboratories in science and technology. He also helped found the National Academy of Sciences, the North American equivalent of the Royal Society.

These roles took him away from his own research, but his reputation was already secure, particularly in electricity and magnetism. In the late 1820s he built on the discovery of electromagnetism and made powerful electromagnets by wrapping thousands of loops of current-carrying wire around iron bars. Some of these powered the first electric telegraph in America, which, among other benefits, allowed the rapid reporting of weather information and forecasts; Henry was instrumental in setting up a national system for the purpose.

Working independently, Henry built electric motors and discovered electromagnetic induction in the same year as Faraday. He took the matter further, discovering 'self-induction' before Faraday did. A changing current in a coil of wire makes a changing magnetic field; that field in turn generates another current in the same coil, opposing the first. So changing currents (including alternating ones) are impeded or 'choked' as they try to pass through a coil of wire. The more rapidly the current changes, the more noticeable the effect. 'Chokes' or 'inductors' remain vital components of electronic circuits today. Henry's contribution is commemorated in the unit of inductance: the henry.
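In today's terms (not Henry's), the size of that opposing voltage is set by the coil's inductance $L$, measured in henries:

```latex
V = -L\,\frac{dI}{dt}
```

The faster the current changes, the larger the opposing voltage, which is why a coil 'chokes' rapidly changing currents while letting steady ones pass.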

Faraday and Henry both grasped the use of induction in a 'transformer'. If two coils are wound on the same piece of iron, the first with a few turns of wire, the second with many, the induced current in the second coil has a much higher 'voltage' than the first. So voltages can be 'stepped up', but only with alternating current (AC); a steady 'direct' current in the first coil induces nothing in the second. Raising the voltage lowers the current, and vice versa (otherwise energy would be created from nothing), an important consideration in the later 'Battle of the Currents' as the use of electricity began to spread.
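For an idealised transformer (real ones lose a little energy as heat), the relations described above can be written in terms of the numbers of turns $N_1$ and $N_2$ on the two coils:

```latex
\frac{V_2}{V_1} = \frac{N_2}{N_1}, \qquad V_1 I_1 \approx V_2 I_2
```

As an illustrative example with assumed figures, 100 times as many turns on the second coil turns 240 V at 1 A into roughly 24,000 V at 0.01 A: the same power, simply traded between voltage and current.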

Thursday, March 26, 2020

Who invented Potato chip?


George Crum, Herman Lay, 1853


When it comes to popular snacks, the potato chip, or 'crisp', must be up there with the leaders. Thomas Jefferson, later president of the USA, came to enjoy the French style while on duty as ambassador there in the late 18th century. He brought the recipe home and served the thick-cut fried potato slices (not today's meaning of 'French fries') to his guests.

History of Potato chips:

In 1853 Native American chef George Crum had French fried potatoes on the menu at the Sun Moon restaurant at the expensive Saratoga Springs resort in upstate New York. But one diner, reputedly the millionaire banker and New York social icon Cornelius Vanderbilt, did not care for them. Too thick, he declared, and sent them back. Crum prepared a second, thinner serving, but the guest was still not satisfied. Irate, Crum cut them so fine they went crisp when fried, too thin and hard to be speared with a fork. He expected the diners to be angry, but they were delighted. Proclaiming the browned, paper-thin tidbits delicious, they demanded more.


Crum was onto a winner. He made his invention a speciality of the house, calling them ‘potato crunches’. He was soon packaging them for sale as Saratoga Chips.

Some distance remained to be covered before this New York dinner-time delicacy could be counted a global gastronomic phenomenon. Three key events all occurred in the 1920s: the invention of machines to peel and slice the potatoes, previously done laboriously by hand; the first use of waxed paper bags to keep the crisps crisp (plastic film not yet being available); and the intervention of Herman Lay. A travelling salesman from the north, he peddled potato chips to storekeepers throughout the American south from the boot of his car. He built a business that linked his name indelibly with the salty snacks, especially once he merged his company with Frito, a Dallas-based firm that made corn chips. Frito-Lay is the largest maker of potato chips in the USA and therefore on the planet.

Monday, March 16, 2020

Who invented Bicycles?


Pierre Michaux, 1861


Why did the invention of the bicycle take so long? For 5000 years, wheels had been used in various combinations, including two side by side. But no one had put two wheels in one line and sat between them until the Frenchman de Sivrac did, around 1790. Yet it did not take off. For one thing, de Sivrac could not steer his machine.

History of Bicycles:

Around 1816 the German inventor Karl Drais marketed his 'running machine' or 'Draisine', often dubbed a 'hobbyhorse'. Steerable, it had no pedals; the rider had to propel the machine by kicking the ground on either side. It was not a comfortable ride; the wooden wheels had iron rims.


In 1839 the Scotsman Kirkpatrick Macmillan connected the back wheel to pedals with cranks, and a rider could easily move faster than they could walk. But the bicycle boom did not come until the French carriage maker Pierre Michaux, asked to repair a Draisine, proposed a pair of pedals fixed to the front wheel, producing the 'velocipede'. That was 1861.
The machine moved forward by only the circumference of the front wheel at every turn of the pedals, so manufacturers made the front wheel larger and larger, producing the 'penny-farthing'. With a very large wheel, the rider perched on top and a small trailing wheel for balance, these were easy to ride, but tumbles were common. Thomas Stevens rode such a bicycle across the USA.
The propulsion breakthrough also came in the 1880s, with the first 'bicycle chain' of metal links to drive the back wheel from the pedals. Different-sized gearwheels had the back wheel running faster than the pedals; a 'freewheeling' mechanism let the rider stop pedalling without stopping moving.
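A rough worked example of the gearing described above, with assumed (not historical) sizes: a 48-tooth pedal gearwheel, a 16-tooth rear gearwheel and a 0.7 m wheel give

```latex
\text{distance per pedal turn} = \frac{N_\text{front}}{N_\text{rear}} \times \pi d
= \frac{48}{16} \times \pi \times 0.7\ \text{m} \approx 6.6\ \text{m},
```

against about 2.2 m for pedals fixed directly to a wheel of the same size, which is why penny-farthing front wheels had to grow so large.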
With the addition of inflatable rubber tyres in 1888, ball bearings to reduce friction in the works and many experiments with the form of the steel-tube frame, bicycles such as John Starley's Rover gained the form we see today, and unprecedented popularity. People now had the speed and freedom of movement they once had riding horses, till bicycles were overtaken by the next big thing in personal transport, the automobile.

Saturday, March 14, 2020

Who invented ATM (Automatic Teller Machine)?


John Shepherd-Barron & others, 1967


So many people claim to have invented the ATM (automatic teller machine) that it must be judged simply an idea whose time had come. The first proposals for a 'hole in the wall' go back to the late 1930s, but the inventor Luther Simjian never patented his 'Bankomatic' and never succeeded in convincing anyone there was really a demand, though Citibank did trial it.

History of ATM Invention:

In 1967 John Shepherd-Barron, who ran a company called De La Rue Instruments, built machines to automatically dispense cash, including one outside a branch of Barclays Bank in north London. At much the same time, James Goodfellow, working with Smiths Industries, devised an automatic cash dispenser with some new ideas, including a keypad to nominate the money needed.


At least two other people have a claim, both Americans: Don Wetzel of the company Product Planning, which made automatic baggage-handling equipment, and John D. White, who reputedly built his first ATM in 1973. The argument about who invented what, and when, will probably never be settled to everyone's satisfaction. Perhaps we can say that the two Englishmen pioneered ATMs in Britain, and the two Americans got ATMs going on their side of the Atlantic. We can also say that Wetzel at least, and probably several of the others, generated the vision of banking without a teller from the frustration of waiting in a queue, and the equally common experience of needing money when the bank is closed.

ATMs were not an instant success. They took several years to catch on among wary consumers. None of the early ATMs had the continuous connection to a central computer we see today, and through that to bank accounts from which the money could be drawn. That linkage would later make ATMs yet another manifestation of the information revolution. Without a link to an account, banks were wary about granting access to cash from such machines. An early challenge was identification, and there Wetzel seems to have the edge. His machines were operated by ID cards with a magnetic stripe, much like today.

Friday, March 6, 2020

Who invented GPS (Global Positioning System)?


Ivan Getting, 1974


The latest 'must have' for your car is a talking roadmap, sitting on your dashboard. From time to time a gentle voice tells you what turn to take to reach your destination. This would be impossible without the Global Positioning System (GPS), another achievement of the space age. The godfather of GPS was Ivan Getting, a research engineer at Raytheon Corporation, which had been very active in pioneering wartime radar and the first microwave ovens. The first customer was the US military. A new generation of guided missiles was to be fired from railway wagons, to achieve security from attack through being moved about. Their handlers needed to know precisely where their weapon was at the time of launch so it could be directed accurately to its target.


During World War II, Getting proposed a network of radio transmitters, each with a precise clock. A receiver at the missile launch site, with a similar clock, would pick up several transmissions simultaneously. By noting how long each signal took to arrive, the missiles' minders could calculate the missiles' distance from each of the transmitters, and hence their location.

Usage of GPS:

In today's GPS, the radio transmitters are circling the Earth 20,000 kilometers up, in precisely known orbits. More than 20 such satellites are aloft at any one time. A GPS receiver uses transmissions from three or four satellites to calculate its position in three dimensions, accurate to a few meters. It is an astounding capability.
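A minimal sketch of that calculation, with everything idealised: the satellite positions and ranges below are invented for illustration, the receiver's clock is assumed perfect (real receivers must also solve for a clock error, which is why a fourth satellite matters), and NumPy and SciPy are assumed to be installed.

```python
import numpy as np
from scipy.optimize import least_squares

# Made-up satellite positions (x, y, z) in kilometres.
sats = np.array([
    [15600.0,  7540.0, 20140.0],
    [18760.0,  2750.0, 18610.0],
    [17610.0, 14630.0, 13480.0],
    [19170.0,   610.0, 18390.0],
])

# Pretend measurements: the distance to each satellite, as inferred from
# signal travel time (range = speed of light x travel time).
true_position = np.array([-40.0, -10.0, 6370.0])       # hypothetical receiver
ranges = np.linalg.norm(sats - true_position, axis=1)

# Find the point whose distances to the satellites best match the ranges.
def residuals(pos):
    return np.linalg.norm(sats - pos, axis=1) - ranges

guess = np.array([0.0, 0.0, 6000.0])   # rough start near the Earth's surface
fit = least_squares(residuals, guess)
print("Estimated position (km):", np.round(fit.x, 3))
```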
The USA launched the first GPS satellites in 1974. For a decade, only their military could use them; civilians gained access in the 1990s. Defense authorities now have access to an enhanced system that reputedly provides positions to the nearest centimeter, and is used to guide smart bombs and missiles. The system cost billions of dollars to develop and nearly $1 billion a year to operate and maintain. However, economic benefits are piling up, especially from the greater accuracy and safety of navigation for ships, planes and even land vehicles.

Tuesday, March 3, 2020

Who invented Supercomputer?


Seymour Cray, 1974


All computers have become much faster and more powerful. A computer's speed can be measured in FLOPS (floating point operations per second), roughly the number of calculations done each second. The earliest computers did a few thousand, the first PCs a few million. Modern desktops manage a few billion, which sounds quick, yet leading-edge 'supercomputers' are a thousand times faster again, and more. In 2006 a machine at a leading US defense laboratory completed an astounding 200 million million (200 trillion) calculations each second. Upgrades should take it to 500 million million.
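For a rough sense of scale, the short Python sketch below estimates the FLOPS of an ordinary machine, assuming NumPy is installed and using the common approximation that multiplying two n-by-n matrices takes about 2n^3 floating point operations:

```python
import time
import numpy as np

n = 2000                                  # assumed matrix size
a = np.random.rand(n, n)
b = np.random.rand(n, n)

start = time.perf_counter()
c = a @ b                                 # one large matrix multiplication
elapsed = time.perf_counter() - start

flops = 2 * n**3 / elapsed                # ~2n^3 operations for an n x n matmul
print(f"Roughly {flops / 1e9:.1f} billion FLOPS (GFLOPS) on this machine")
```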

History of Supercomputer:



The pioneer was Seymour Cray, whose first machine, the CRAY-1, reached the market in 1974. Building faster computers raised many issues. Cray figured that a number of smaller computers running side by side would go faster than a single larger machine. To minimize the time taken to shift data from processor to processor, the connections were kept as short as possible; his most famous designs were cylindrical. All that processing in a small space generated a lot of heat, so elaborate cooling was needed. Lastly, the data had to be moved in and out of the computer fast enough not to slow the calculations. As Cray said: 'Anyone can build a fast processor. What we need is a fast system.'
For several decades Cray-designed supercomputers (a term he did not use) whipped the opposition for speed. Nothing else came close. Cray himself hit financial turbulence later, perhaps not surprising in so risky a business, where one machine could cost $5 million. Still, he was successful, selling 100 CRAY-1s and setting the early pace. Others have now taken over, using 'massively parallel' computing: tens of thousands of processors like the CPU in your desktop run together to generate blinding speed.

Benefits:

The demand for speed comes from military agencies, universities and major research laboratories with big, tough calculations to do. Tasks include forecasting the weather (always a rugged one), figuring the shapes of chemical molecules used as medical drugs, simulating how new nuclear weapons or aircraft will perform (much cheaper than a wind tunnel) and cracking codes. By the way, only the top machines can beat the best human chess players.