
DayVectors

sep 2005 / last mod sep 2021 / greg goebel

* 22 entries including: global water supply, Ronald Reagan's war on terror, the internet-scale operating system, Norden & Sperry bombsights, voice-IP catching on, storms and global warming, ghost town at Chernobyl, product lifecycle management, biofuels reviewed, advanced wristwatches, the rise of hybrid autos, LeTourneau's overland train, US Air Force space warriors, hyper-precise measurement, 18th-century clockwork androids, cranberry agriculture, and time spammer Robby Todino.

[FRI 30 SEP 05] LIQUID TREASURE (4)
[THU 29 SEP 05] VOIP GATHERS MOMENTUM
[WED 28 SEP 05] STORM WARNING
[TUE 27 SEP 05] CHERNOBYL GHOST TOWN
[MON 26 SEP 05] INTERNET-SCALE OPERATING SYSTEM (2)
[FRI 23 SEP 05] LIQUID TREASURE (3)
[THU 22 SEP 05] PRODUCT LIFECYCLE MANAGEMENT
[WED 21 SEP 05] BIOFUELS ON A ROLL
[TUE 20 SEP 05] WRISTWATCH OF TOMORROW
[MON 19 SEP 05] INTERNET-SCALE OPERATING SYSTEM (1)
[FRI 16 SEP 05] LIQUID TREASURE (2)
[THU 15 SEP 05] HYBRIDS ARRIVED
[WED 14 SEP 05] THE OVERLAND TRAIN
[TUE 13 SEP 05] SPACEFIGHTERS
[MON 12 SEP 05] THE NORDEN & SPERRY BOMBSIGHTS (2)
[FRI 09 SEP 05] LIQUID TREASURE (1)
[THU 08 SEP 05] MEASUREMENT AT THE EXTREMES
[WED 07 SEP 05] THE CLOCKWORK ANDROIDS
[TUE 06 SEP 05] CRANBERRIES AS TECHNOLOGY
[MON 05 SEP 05] THE NORDEN & SPERRY BOMBSIGHTS (1)
[FRI 02 SEP 05] REAGAN'S WAR ON TERROR (5)
[THU 01 SEP 05] TIME SPAMMER

[FRI 30 SEP 05] LIQUID TREASURE (4)

* LIQUID TREASURE (4): The whole private-versus-public controversy over the supply of water tends to be a dull read, coming off as a matter for the bean-counters. The same cannot be said of one of the major issues in water management: dams.

In the first place, big dams are simply spectacular constructions, seeming monuments to humanity's control over nature, and an all but irresistible focus for major public projects. Before World War II, both the US and the Soviet Union engaged in huge dam projects as part of development efforts, and since that time many other countries have followed their example. China became very enthusiastic, and is currently the world leader in dams -- 22,000 in all. The runner-up, the US, only has 6,600. India is also enthusiastic, with 4,300 dams, and Japan has built 2,700 dams to control 90% of the country's rivers. China has been so wild about dams that in some years the lowest stretches of the Yellow River dry up completely. The country's enthusiasm has not been without its difficulties, since China has suffered some of the worst dam disasters in history, with a series of breakages in 1975 killing hundreds of thousands of people.

China is now beginning to fill the reservoir behind the Three Gorges dam on the Yangtze, which will be the world's biggest dam when it is completed in 2009. Three Gorges has been a magnet for protests by environmental activists, who despise big dams. As is often the case with activists, they have many legitimate objections: dams cause siltification upstream; they block the movement of migratory fish; and they displace natural habitats and also large populations of people. Relocations of citizens displaced by dams have at their worst looked something like classic punitive mass deportations. Dams can also be economically questionable, with the benefits of irrigation, power, and recreational use failing to offset the huge direct costs of their construction and the indirect costs of displacement of habitats and towns.

American environmental activists have a particular axe to grind against the Glen Canyon dam on the Colorado River in Arizona and the Hetch Hetchy dam in California. Such activism is not restricted to the US by any means, either. The Sardar Sarovar dam in India was completed over loud protests by Indian citizens, backed up by court challenges, and even in China, where anti-government protests are, uh, not encouraged, there have been public objections against the Three Gorges dam.

* Despite all the fuss and fury, governments keep on building dams. This is partly because dams can have major practical benefits, though activists are extremely reluctant to admit it:

Furthermore, problems such as siltification can be addressed by changes in dam design and operating procedures. Migratory fish can be accommodated by fish ladders. Of course, the problem of relocating local populations is a matter of sensible management. The World Commission on Dams (WCD), which threw together greens, dam-builders, financiers, and public officials, released a report in November 2000 that provided 26 guidelines for dam projects, suggesting such items as consultation with locals to be affected by the dam, a proper relocation plan, and above all an honest, realistic cost-benefit analysis.

In other words, a case can be made that there is such a thing as a "good dam". This is not the same thing as saying that all dams are good. Dams, as noted, are spectacular, and politicians have a tendency to exploit them as a way to get the public's attention. Dams also involve a lot of money, which translates to pork contracts to spread around to powerful constituents. In some places, such floods of money also translate to official corruption, since it is much easier to discreetly skim off funds from a large project than it is a small one. Smaller, more appropriate water projects that make more sense may be ignored.

The result is that many dams have been built that weren't actually needed, or have been overpriced and oversold when they were. Some of them were even built in places where it was obvious they shouldn't have been. The infamous Teton dam in the US state of Idaho was built on ground known to be less than solid, and collapsed in June 1976, as its reservoir was being filled for the first time.

* Weighing the pros and cons, dams are still a pain. A World Bank official says that big dam projects are about 10% of the organization's portfolio, and about 95% of its headaches. World Bank funding for dam construction has, not surprisingly, fallen from about $1 billion USD a year a decade ago to about $100 million USD a year now. The WCD guidelines have added to the pain as well, since even many World Bank officials believe that if they're strictly interpreted, they make building a dam completely impossible. What else might be expected of a document prepared by a committee?

The World Bank still wants to back dams that make sense, and believes that the WCD guidelines, taken with a bit of salt, are valuable. Bank officials reply in some exasperation to the unceasing hostility of dam-bashing activists that if the World Bank doesn't get involved, countries will go ahead and build dams anyway, without any extra-national oversight, or any bother with even a soft version of the WCD guidelines.

The Three Gorges dam has no World Bank funding. The Chinese government has now approved a $50 billion USD water transfer project of massive scale that involves shifting entire rivers from the south of the country to the north. India is doing China one better, working on a massive project to connect all the nation's major rivers. There have been loud protests against such grand schemes, and even moderates wonder if they're not expensive answers to problems that could be answered with much simpler and cheaper solutions. [TO BE CONTINUED]


[THU 29 SEP 05] VOIP GATHERS MOMENTUM

* VOIP GATHERS MOMENTUM: As discussed in an article in THE ECONOMIST ("The Phone Call Is Dead; Long Live The Phone Call", 4 December 2004), it might seem a bit of a jump at first to realize that the internet turns out to have a close link to the telephone, but after all, the internet is about digital data communications, and if conversations are digitized they become just another form of data to communicate. Since even the conventional telephone system includes a lot of digitized voice conversations, this doesn't turn out to be much of a jump at all.

Where the jump lands is on "Voice Over Internet Protocol (VOIP)", in which digitized telephone conversations are chopped up into data packets and passed over the internet. VOIP is a hot topic right now, and is likely to get hotter, the odds being that it will sooner or later replace the existing phone network. One benefit is that, since voice becomes simply another form of data under VOIP, it just becomes another form of email that can be copied, forwarded, and so on. In addition, a phone number becomes mobile, moving with the user. The real kicker is that VOIP means the end of rates. A subscriber signs up for a set fee, and pays no more, no matter how many calls are made, how far they go, or how long they last. No more long distance charges. With the explosion in communications bandwidth, data transmission is now, within bounds, too cheap to meter.
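
The core mechanism is easy to see in concrete terms. Below is a minimal Python sketch of what "chopping a conversation into data packets" amounts to: digitized audio is split into fixed-size frames, each stamped with a sequence number and sent as a UDP datagram. The frame size, address, and port are invented for illustration; real VOIP systems layer standard protocols (RTP, SIP) on top of this basic idea.

    # Toy packetizer: split a digitized voice stream into UDP packets.
    # All parameters here are illustrative, not from any real VOIP stack.
    import socket

    SAMPLE_RATE = 8000        # telephone-quality audio, 8000 samples/second
    FRAME_MS = 20             # send one packet per 20 milliseconds of speech
    FRAME_BYTES = SAMPLE_RATE * FRAME_MS // 1000   # 160 bytes at 8-bit samples

    def send_voice(audio: bytes, dest=("192.0.2.1", 5004)):
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        for seq, start in enumerate(range(0, len(audio), FRAME_BYTES)):
            frame = audio[start:start + FRAME_BYTES]
            # Prefix a sequence number so the receiver can reorder late packets.
            sock.sendto(seq.to_bytes(4, "big") + frame, dest)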

The VOIP charge has been led by a pack of startups, most significantly Vonage. Even Vonage's network of hundreds of thousands of subscribers is still tiny compared to the vast network of conventional phones, but the company is growing rapidly. Interestingly, giant AT&T has jumped on the VOIP bandwagon, and feels assured of becoming one of the winners in the end. Cable service providers are now trying to get on board, seeing that VOIP nicely complements their video delivery and cable-modem services. Traditional telephone service providers like Qwest are starting to offer VOIP as well.

Some observers think the little guys like Vonage have no chance to survive against the big cable and telco operators. However, the big guys have their own set of weaknesses. A cable service provider only serves a particular area -- and so if users move, they have to pick up services again. The telcos actually hate VOIP, since they are the ones who will suffer badly because of the end of metering, and have offered it only reluctantly. They have been trying to restrain VOIP by lobbying government regulators, an obnoxious tactic that can only buy them some time at best.

The real giant-killers are the new wireless technologies now emerging, which will undermine the existing hardware infrastructure owned by the cable providers and the telcos. Portable VOIP handsets based on the current wi-fi spec should be available soon, and the high-bandwidth Wi-Max technology is just a few years down the road. Wi-Max will mean cheap sales packages offering fast mobile internet access with VOIP capability; the days of long-distance tolls are clearly numbered.

[ED: As of 2021, I have a neat solution to the phone service issue via Google Voice -- which gives me a VOIP number free of charge, that I can use on an internet connection. I hated having to pay for phone service; I didn't have to pay for email service, and I had more use for email. Another advantage of Voice was that phone spammers effectively went away, since I could monitor and block them easily.]


[WED 28 SEP 05] STORM WARNING

* STORM WARNING: As discussed in an article in THE ECONOMIST ("Storm Surge", 17 September 2005), over the summer of 2005, the US and Caribbean nations suffered through one of the worst hurricane seasons on record, with storms as strong or stronger than any seen before. The obvious question is: is this just a statistical glitch, or does global warming have something to do with it?

hurricane from orbit

The global weather system is such a complicated phenomenon that few of the knowledgeable would come right out and lay the blame on global warming, but few are willing to rule it out, either. A recent paper from SCIENCEMAG.org by researcher Peter Webster of the Georgia Institute of Technology investigated the matter and came to some cautious conclusions.

It is generally agreed that hurricanes can form if the ocean surface temperature is above 26 degrees Celsius (79 degrees Fahrenheit). What meteorologists don't agree on is whether higher temperatures beyond that threshold make much of a difference. Higher temperatures might produce more, longer, or stronger storms -- but nobody knows for sure.

Since 1970, average ocean temperatures have risen about half a degree Celsius. Webster and his team went through weather-satellite imagery archives back to that year to obtain observations of the North Atlantic, West Pacific, East Pacific, Southwest Pacific, the North Indian Ocean, and the South Indian Ocean. (Satellite records didn't provide adequate coverage before 1970.) The observations showed a definite increase in average ocean surface temperature over the decades in all of these regions, except for the Southwest Pacific.

Analysis of the cyclonic storms themselves showed that the only place they have consistently increased in number and duration is the North Atlantic. That finding tends to deflate the idea that a rise in ocean temperatures is the main, or at least direct, cause of more and longer-lasting storms. In fact, there was no significant increase in the maximum windspeeds of storms, even in the North Atlantic. However, it would have flown in the face of public perception for the report to say that nothing much seemed to be amiss, and in fact one of its conclusions was that the number of the worst storms, in categories 4 and 5, has doubled -- everywhere, even in the Southwest Pacific, where ocean temperatures haven't risen. Something is clearly happening. Exactly what remains an unsettling mystery.


[TUE 27 SEP 05] CHERNOBYL GHOST TOWN

* CHERNOBYL GHOST TOWN: As discussed in an article from SCIENCEMAG.org ("A Radioactive Ghost Town's Improbable New Life" by Richard Stone, 20 May 2005), in April 1986, the town of Pripyat in the Ukraine was putting the finishing touches on a pair of new public attractions in the city center: a ferris wheel and a bumper-car ride. They were supposed to be open to the public on May Day, but a "mayday" of a different form took place on 26 April, when reactor number four of the Chernobyl nuclear complex, visible from the town, exploded, spewing radioactive isotopes into the air. Pripyat's 50,000 inhabitants were quickly evacuated, though they expected to return after the emergency was contained.

Pripyat amusement park

They never came back, and 19 years later the ferris wheel sits rusting in the center of the abandoned town, having never seen its opening day. However, Pripyat is now a subject of some scientific interest. In an era of a global war on terror that affects both East and West, it provides a useful model of what might happen to a town if it were attacked with a "dirty bomb" -- a conventional explosive weapon packed with radioactive isotopes.

A town named Slavutych was built in the vicinity to replace Pripyat, with an International Radiological Laboratory (IRL) established there to support research on the effects of the accident. Two years ago, a team of American and Russian researchers operating out of the IRL began to sample radioactivity in Pripyat and in other parts of the region around the Chernobyl site. The area worst hit by the fallout from reactor number 4 was due west of the reactor complex, and is called the "Red Forest" because of the color of all the dead trees in the area; everything there died. Pripyat got off lucky, since it was to the north and out of the direct path of the plume. If the plume had fallen directly on the town, thousands of citizens would have died over the following few years.

The US Defense Threat Reduction Agency (DTRA) plans to use the research as the basis for a new study on the possible effects of a dirty bomb attack. Says meteorologist John Pace of the DTRA: "We can't directly simulate this kind of attack, so we use various means of obtaining representative data." Although Pace admits that there are "huge differences" between the Chernobyl accident and a dirty bomb attack, he maintains that the Pripyat site can still provide very useful data. The place is still "hot": the moss growing in the cracks in the streets drives Geiger counters wild. It is also much more structurally diverse than any dummy city might be, with buildings of up to 16 floors; it will be useful to determine if and how radioactivity varies with height above ground level.

Says radioecologist Ronald Chesser of Texas Tech University, who was a prime mover behind the earlier surveys of the town: "Pripyat is not a mockup. It is not a sterile facade of buildings erected for the purpose of blasting particles through its empty spaces. Bicycles, pianos, libraries, and baby dolls decaying through 19 winters are there to remind us that learning from this event really matters."


[MON 26 SEP 05] INTERNET-SCALE OPERATING SYSTEM (2)

* INTERNET-SCALE OPERATING SYSTEM (2): Internet distributed applications are sexy, but they give the feeling that they're only scratching the surface of what can be done. The next logical step is to build something with more general application and greater capability. That next step is the "internet-scale operating system (ISOS)" of the future, which could provide the facilities to manage any sensible distributed application, and also ensure the economic success of distributed schemes by implementing mechanisms where users would be paid for their contributions.

Implementation of an ISOS would of course be far from trivial. The nodes on the internet use a wide variety of processors and operating systems, have different amounts of disk and memory resources, and are linked to the internet over connections whose performance varies over a wide range. Nodes may disappear at intervals, sometimes not coming back, while new nodes appear. The ISOS must of course be no more intrusive to users at each node than necessary, and also abide by any restrictions imposed by users, such as operating only at certain times of the day or blocking certain uses. On the other side of the coin, the ISOS must also ensure that users follow the rules, preventing them from using the system to perform frauds or malicious sabotage.

The ISOS must implement an overall strategy to allocate the resources that it makes use of, and also ensure appropriate payment for resource use. Interestingly, it is possible to have a single strategy that addresses both problems at once, using a "free-market" economic model. For example, the Mojo Nation distributed application uses a "token currency" named, unsurprisingly, "mojo" to control its operation. Any node contributing its resources earns a little mojo, while any node that uses resources has to pay mojo. The "invisible hand" that guides market economies with remarkable smoothness works just as well for Mojo Nation, helping ensure a surprisingly optimal allocation of resources.
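
A toy sketch makes the accounting concrete. The class below is purely illustrative -- Mojo Nation's actual protocol was decentralized and considerably more elaborate -- but it shows the basic contribute-to-earn, spend-to-consume loop:

    # Illustrative token-currency ledger in the spirit of "mojo".
    # Class and method names are invented, not Mojo Nation's actual API.
    class TokenLedger:
        def __init__(self):
            self.balances = {}                  # node id -> token balance

        def credit(self, node: str, amount: int):
            # A node contributed resources: pay it.
            self.balances[node] = self.balances.get(node, 0) + amount

        def debit(self, node: str, amount: int) -> bool:
            # A node wants to consume resources: charge it, refuse if broke.
            if self.balances.get(node, 0) < amount:
                return False                    # no tokens, no service
            self.balances[node] -= amount
            return True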

An ISOS would support commercial sale and purchase of its distributed resources, keeping track of resource usage; the specific arrangements made by buyers and sellers for payment; and what the going rate for those resources should be. It would also make sure that buyers receive what they pay for, and no more is taken from sellers than they are paid for. In essence, an ISOS would incorporate elements of online financial transaction systems and a bank.

* The ISOS doesn't really exist just yet, but a vision is emerging. The ISOS would consist of software, an "ISOS agent", operating on all the user nodes in the network, interacting with a central controlling system that runs on a set of "ISOS servers".

The ISOS agent would be implemented in a stripped-down "microkernel" that provides only fundamental "core" functions, while the higher-level functions would be implemented in a set of programs making use of the microkernel. An ISOS agent would permit execution of a distributed application on its node. It would allocate resources on its node and schedule the use of those resources, as well as handle communications to other nodes, and keep track of resource use to ensure proper payment for a service. The agent would not perform functions better executed by the node's host operating system.

The ISOS server complex could be run by a government-funded organization, or a consortium of interested providers. A relatively small number of servers would be adequate to support the entire ISOS and still provide enough redundancy to keep the system in continuous, reliable operation. The server complex would maintain a database to track the state of the system. For each node in the ISOS, the database would include:

Node users would contact the server complex, probably through a website, to obtain the agent software, install it on their node, and then set up their account with the ISOS. Once in operation, the ISOS agent would contact the ISOS server complex to obtain a list of tasks to execute.

Buyers using the ISOS resources would submit task requests to the ISOS server complex, indicating what resources are required and, if necessary, providing specific application programs. The server complex would then parcel out the tasks to the nodes and track task progress, switching task elements to new nodes as others become unavailable. The ISOS would also include a software toolkit to allow programmers to build their own ISOS applications.
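
Since no ISOS actually exists, this can only be sketched, but the server-side bookkeeping just described might look something like the following, with all names invented for illustration:

    # Hypothetical ISOS server bookkeeping: parcel out tasks, track progress,
    # and reassign work when a node disappears from the network.
    class TaskServer:
        def __init__(self, tasks):
            self.pending = list(tasks)     # tasks not yet assigned
            self.assigned = {}             # task -> node currently running it
            self.results = {}              # task -> completed result

        def request_work(self, node):
            # Called when a node's ISOS agent asks for something to do.
            if self.pending:
                task = self.pending.pop()
                self.assigned[task] = node
                return task
            return None

        def report_result(self, task, result):
            self.assigned.pop(task, None)
            self.results[task] = result

        def node_lost(self, node):
            # Re-queue every task that was running on a vanished node.
            for task, owner in list(self.assigned.items()):
                if owner == node:
                    del self.assigned[task]
                    self.pending.append(task)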

At present, distributed internet applications are in their infancy. It will take some time not merely to develop and implement a workable ISOS, but also for its use to catch on. It is likely that once it does, the ISOS will become yet another marvelous new technology that everyone ends up taking for granted. [END OF SERIES]

[ED: As of 2021, the ISOS didn't happen. It appears that computation via mass networks of personal computers was really a niche thing.]


[FRI 23 SEP 05] LIQUID TREASURE (3)

* LIQUID TREASURE (3): Despite the difficulties, the privatization of English and Welsh water utilities has been judged successful. Not everyone is enthusiastic about jumping on the privatization bandwagon, however, and most who have done some privatization prefer the French model, with the ownership of waterworks retained by the state.

German water utilities, except in Berlin, remain in public hands -- not surprising for a nation so fond of regulation that foreign businessmen trying to work there claim the attitude is that everything is forbidden unless specifically allowed. In Spain, only about half the water system remains in public hands, the biggest private player being Aguas de Barcelona, another arm of Suez. American cities have flirted with privatization, with the city of Atlanta awarding a contract to Suez and then canceling it.

The experience of privatization in the rich countries shows it is a practical option there, at least as workable as public utilities or more so, though getting from here to there can be troublesome. Boosters of privatization believe that the place where private water suppliers can make a real difference is in underdeveloped countries.

Unfortunately, the problems of getting from here to there are even worse in such places. Veolia has done well enough in the "emerging free states" of Eastern Europe, whose stars are rising more or less rapidly, but the troubles of Suez, Veolia, RWE / Thames, and Saur (another French firm, the world's fourth-biggest water supplier, part of the Bouygues construction operation) have made them cautious. All jumped into the "emerging markets" and all found risks greater and rewards smaller than expected. Industry observers say that if any international water firm announced that it was, so to speak, washing its hands of emerging markets, its stock value would promptly rise.

For example, the World Bank had lent money to the public water utility for Manila in the Philippines, only to find out that the funds tended to leak away without any improvement in the city's water supply. In 1997, the World Bank leaned on the Philippine government to let out a contract to a Suez subsidiary. It turned out to be a rocky arrangement and was terminated in late 2002, with quarrels over pricing and the company complaining about difficulties in dealing with the municipality.

The World Bank also pushed Latin American countries to privatize. Buenos Aires seemed like a real success story for the concept for a while. The municipal water concession was granted to Aguas Argentinas, in which Suez is a majority stockholder. Service quality improved, services were extended to millions more poor customers, and rates dropped. Then, in 2002, the Argentine economic crisis made itself felt. Aguas Argentinas had a clause in its contract that allowed a rate increase if there was a large currency devaluation. The government refused to go along, the whole thing went to hell, and at last notice all that was going on was litigation.

In Bolivia, three cities -- La Paz, Santa Cruz, and Cochabamba -- undertook initiatives to improve their water supplies. La Paz turned to a Suez subsidiary, an exercise that turned out well enough. Santa Cruz decided to use World Bank loans to improve their public water system, and it appears this was a successful approach as well. Cochabamba's mayor got ambitious, planning to build a dam and a tunnel. The World Bank judged him too ambitious and refused to back the effort. The Bolivian government pushed on anyway, awarding a contract to the giant US Bechtel construction firm. Water rates were raised to provide the money, leading to public demonstrations in 2000 in which one citizen was shot and killed. This effort also fell apart and ended up in litigation.

* The Cochabamba fiasco has been a convenient target for activists who oppose privatization. Local activists are proud to announce that water rates have gone back to normal in the city. The only problem is that the water system has also remained locked in the status quo, meaning only 60% of the city's residents have reasonable access to water.

Advocates of privatization are quick to point this out, and also add that the Cochabamba plan was half-baked, as the World Bank saw from the start. Rates would have had to rise in any case to finance improvements to the system, but overblown ambitions put them in the region where they were publicly unacceptable. The real lesson of Cochabamba, the advocates say, is that privatization will not work effectively if the local government doesn't know how to manage the process, which partly implies a reasonably sensible regulatory environment, with consistent rules and implementation. In fact, advocates say that the term "privatization" is unrealistic. In practice, as the examples of France and Britain show, it's a case of "public-private collaboration".

The water companies and international development organizations are now piecing together what they hope are reasonable strategies that local governments can use to work with the private sector to improve water systems, implementing sensible funding mechanisms and having contingencies for accidents of fate, such as currency devaluations.

The simple truth of the matter is that municipalities just plain need the private help. In Delhi, the capital of India, the municipal water organization proudly boasts that it provides service to 86% of the city's population, with usage levels comparable to those of wealthier countries. What remains unsaid is that much of the water is wasted by a broken-down distribution system, and the poorer parts of the city only get water for short periods each day. Prices are held ridiculously low by the politicians and metering is ineffectual, so the funds aren't available to change matters even if there were the will.

There is no reason to believe that all public water utilities are so inefficient, but in an environment where there's so much room for improvement, it seems pigheaded to oppose the assistance of the private sector. Even where public systems remain in control, the competition from the private sector, or even the threat of it, can help spur the public systems to better efforts. [TO BE CONTINUED]


[THU 22 SEP 05] PRODUCT LIFECYCLE MANAGEMENT

* PRODUCT LIFECYCLE MANAGEMENT: As discussed in an article in THE ECONOMIST ("Better By Design", 17 September 2005), anyone who's ever worked in a manufacturing environment knows perfectly well that building a product is not a case of simply throwing together the parts, slapping the final product into a box, and then shipping it. The process requires an immense amount of nitpicking work and logistics that traces back into product development, including design documents such as blueprints, specifications and lists of parts, and manufacturing procedures; through production, with the parts obtained, stockpiled, and distributed for manufacture; and into customer service, with replacement parts stockpiled and made available when needed, generally long after production has ceased.

This is a very complicated process, and it is easy for things to go wrong. For example, the accident that crippled Apollo 13, the 1970 US manned Moon mission that almost didn't make it back to Earth, was due to a failure to update a single simple component, a relay, when system requirements changed during development. Software can help avoid such glitches, and in fact "product lifecycle management (PLM)" software is a big business these days, with a sales volume of about $10 billion USD in 2004.

For example, consider a company building a personal computer. PLM software can allow the design engineers to leverage off of earlier designs already in the PLM system's database, reusing old parts and validating their compatibility with the new design. The PLM system is then leveraged into manufacturing and service. The end result is cheaper and faster development, as well as improved product quality.

The roots of PLM software go back to the computer-aided design (CAD) packages that became established in the 1980s, which allowed engineers to design products using computers, with the software providing visualizations and documentation for parts. In the 1990s, CAD systems were linked to component databases to become "product data management (PDM)" systems. PDM software allowed all the data about a product and its components to be placed in a single, easily accessible database, a much more convenient scheme than the archives of paper files that were the norm for decades. PLM adds more features to PDM, such as tools to allow managers to get a high-level view of their product lines, or to permit packaging engineers to design a proper packaging system for a product, or to allow marketing staff to show an unbuilt product to customer focus groups.
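
The part-reuse idea at the heart of PDM and PLM is simple to sketch. The fragment below is a bare illustration with invented part numbers and fields, not any vendor's actual schema: a single queryable parts database, against which a new design can find existing components that meet its requirements.

    # Toy PDM parts database; part numbers and fields are invented.
    parts_db = {
        "CAP-0047": {"type": "capacitor", "rating_v": 35, "stock": 12000},
        "CAP-0048": {"type": "capacitor", "rating_v": 65, "stock": 800},
    }

    def find_reusable(part_type: str, min_rating_v: int):
        # Return existing parts that meet the new design's requirements.
        return [num for num, p in parts_db.items()
                if p["type"] == part_type and p["rating_v"] >= min_rating_v]

    # If a system requirement changes -- say a 28-volt bus becomes 65 volts,
    # as in the Apollo 13 case above -- one query shows what still qualifies:
    print(find_reusable("capacitor", 65))       # -> ['CAP-0048']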

One factor that enhances the value of PLM is globalization. Products have to be tweaked in various ways for regional markets to take into consideration local tastes and regulations, a task that PLM makes easier, and companies with development teams spread around the world have to integrate the efforts of far-flung engineers, with PLM proving useful for keeping everyone in marching order. British jet engine manufacturer Rolls-Royce, for example, uses PLM to coordinate design teams in Britain, the US, and India for developing the Trent 900 turbofan for the Airbus A380 super jumbo jet.

Another factor is the general trend for more complicated products, which would have painfully strained the old paper-based product management systems. Dassault Aviation of France claims that the company's new Falcon 7X business jet was the first aircraft to be designed completely in a virtual environment. Dassault's PLM system linked the company's engineers with their counterparts at 27 subcontractors; it provided detailed modeling not merely of all the parts in the aircraft, but of the operation of robots used to build the machine and of maintenance procedures used to keep it flying. There were minimal assembly problems with the first aircraft.

In addition, PLM can be used to track materials in components that are subject to regulatory control. New regulations for heavy metals that are often used in electronic components are scheduled to go into effect in 2006, and a PLM system can be used to track components subject to these regulations. Similarly, packaging regulations, particularly for pharmaceuticals, can vary from country to country, and a PLM system can be used to help keep everyone involved up-to-date on the proper packaging configuration.

In 2000, giant General Motors (GM) was straining to keep up with international product development and marketing, and so the company bought a PLM system. Now about 10% of GM uses the system to permit leverage of parts between vehicles; pass changes in part design onto tool and die fabrication; perform virtual "crash tests" of new designs; and coordinate design, fabrication, and delivery of subassemblies from subcontractors, with parts usually proving to fit perfectly on delivery. Product development cycles dropped from 48 months in 1997 to about 18 months today. There was some difficulty in implementing the system, since many of the development functions that had been compartmentalized in the old days were now integrated, with designers forced to deal with continuous inputs from marketing.

PLM had its roots in the big aerospace and automotive manufacturers, but now it is spreading downward to smaller firms, particularly as small subcontractors increasingly find themselves required to interface with the PLM systems of bigger firms. PLM is becoming part of the engineering curriculum at the top engineering universities. Eventually PLM promises to become so widespread that it will cease to exist as a single system: instead, manufacturers will have software packages that cover all the bases for product development, manufacture, marketing, sales, and service, with these packages neatly networked with each other. People will then forget that there was ever a time that things were different.


[WED 21 SEP 05] BIOFUELS ON A ROLL

* BIOFUELS ON A ROLL: As discussed in an article in THE ECONOMIST ("Stirrings In The Corn Fields", 14 May 2005), the idea of "biofuels" -- automotive fuels derived from crop plants -- has long been mocked as a fantasy of over-enthusiastic Greens on one hand and as an energy-inefficient front for farm subsidies on the other. However, the reality is that biofuels appear to be going from strength to strength.

In the US, where maize is the primary raw material for biofuels, biofuel production is growing at 30% a year. Brazil -- the world leader in biofuel production, using sugar as the primary raw material -- is charging ahead even faster. China has built the biggest ethanol plant in the world and plans to build another like it, while Germany plans to increase production of "biodiesel" by up to 50% a year, and France intends to triple production of ethanol and biodiesel by 2007.

Despite the growth, biofuels are still a small-time player in the energy market, but now they are beginning to be taken more seriously. The reason is obvious: soaring costs of petroleum. Given that biofuels are generally subsidized, they are now actually cheaper than petroleum, even factoring in the reality that they have less energy per liter than gasoline. In places like the US Midwest, America's breadbasket, they are almost as cheap even without a subsidy, and Brazilian ethanol is well cheaper on its own merits.

* Biofuels are not exactly a new idea. Rudolf Diesel demonstrated an internal combustion engine running on peanut oil in 1900, Henry Ford was an advocate of crop-based ethanol in the 1920s, and biodiesel has been used here and there since the 1930s. Biofuels are easy to synthesize: biodiesel from animal or plant oils, ethanol from sugar or grain. In fact, a diesel engine will run for a time on supermarket cooking oil, at least until the engine filters clog up.

The Brazilians were the first to seriously exploit biofuels. Brazil grows lots of sugar cane, and when petroleum prices shot up in the early 1970s, the country went to ethanol in a big way, even building cars that could burn pure ethanol. Then fuel costs went down and sugar prices went up, making sugar a more profitable product, and by 1990 the Brazilian ethanol system had faded.

Biofuels didn't disappear completely, however, and with the rise in oil prices, not only has Brazilian ethanol bounced back, but mixed fuels are now common all over the world. Europeans with diesels usually fill up on "B5" -- diesel with 5% biodiesel, usually made from rapeseed (canola) -- without thinking twice about it, and many Americans pump out "E10", gasoline with 10% ethanol, AKA "gasohol", with a similar lack of concern. Biofuels are even more important in specific niches and markets. Some American and Canadian public-sector vehicles tank up with B20, and Californians use straight, 100% biodiesel. With additives to keep it from freezing up on cold wintry days, straight biodiesel is also sold in Germany and Austria.

Pure ethanol isn't good for normal automobiles, since it attacks gaskets and hoses that haven't been designed for it, but in 2003 Brazilian carmakers introduced "flex-fuel" automobiles that can run on pure gasoline, pure ethanol, or any mix in between those extremes; Brazilians typically burn E25 to E75. About a third of the cars sold in Brazil now have flex-fuel engines. Flex-fuel cars are being sold in the US as well, with about four million on the roads, burning mixes from E70 to E85, though pumps are always labeled "E85". The new gaskets, hoses, and other fixes to permit burning pure ethanol are not expensive, and the sticker price for a flex-fuel car is not much higher than that for a standard auto.

Not surprisingly, petroleum companies have not been happy about biofuels, proving reluctant to supply them at filling stations, but they are under pressure to get happier. Environmentalists like ethanol because a mix of ethanol and gasoline, or of ethanol with an additive named ETBE and gasoline, burns cleaner than gasoline, and so it is mandated in some places. Farmers are of course very excited about the idea of selling crops for biofuels, with over a tenth of America's maize crop going into biofuels in 2004. The farmers have lobbied a few American states to mandate use of biofuels, and there are also bills to that effect before the US Congress, targeting production of 30.3 billion liters (8 billion US gallons) of ethanol in the US by 2012. This is about twice current production, though still only 4.6% of total American fuel consumption. The US has 84 ethanol plants -- some organized as farm cooperatives -- with 16 more in the pipeline, and there is considerable room for expansion.
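
As a quick sanity check on those figures (illustrative back-of-envelope arithmetic only), the unit conversion and the implied total work out as follows:

    # Back-of-envelope check of the ethanol targets quoted above.
    target_liters = 30.3e9
    liters_per_us_gallon = 3.785
    target_gallons = target_liters / liters_per_us_gallon
    print(round(target_gallons / 1e9, 1))       # ~8.0 billion US gallons

    # If 8 billion gallons is 4.6% of American fuel use, the implied total:
    print(round(target_gallons / 0.046 / 1e9))  # ~174 billion gallons a year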

American biodiesel production is very small at present, only about 0.1% of the total market for diesel in the US, but it is expected to rise, with new environmental regulations enhancing its prospects. Diesel automobiles are much more common in Europe, and so is biodiesel, with sales promoted by tax breaks from the agreeable governments of Germany, Italy, France, Spain, and Britain. German biodiesel production, according to the producers, has tripled since 2002, now accounting for 4% of the diesel sold in Germany and 2% of all automotive fuels sold there. French biodiesel production is scaling up rapidly, and Britain, though a laggard, is starting to ramp up as well, particularly since supermarket / filling station giant Tesco is throwing its corporate weight behind biofuels. A large biodiesel plant is being built in Norway by Fortum Oil, providing a bit of evidence that the petroleum producers are starting to find biofuels intriguing instead of annoying.

European ethanol production lags biodiesel, but it is catching up, particularly in France and Italy, where it is being prodded by tax breaks. Although rapeseed and soya, useful for biodiesel production, are common crops in Germany, they're not in France and Italy, where ethanol feedstocks such as sugar beet, grain, and grapes are popular.

That of course suggests the belief that biofuels are largely crop subsidies has some basis in fact, and indeed without the tax breaks there's no way European biofuel production is competitive with petroleum unless the price of oil per barrel shoots up substantially further. Brazil could sell its ethanol in Europe at a profit without subsidies, but the farm lobbies that have helped push biofuels in both Europe and America have also pushed for tariffs on foreign biofuels to protect the home-grown stuff.

The economic health of industries propped up by government subsidies is justly suspect, but biofuel production technology is in its infancy and there's a lot of room for growth, meaning that biofuel may be able to throw away its government-provided crutches not too many years from now. Efforts are being made in Europe and Canada to begin production of biofuels from straw, wood, and other relatively cheap feedstocks. Biofuel enthusiasts claim that within a few decades biofuels will cost only half as much as petroleum, even without subsidies, and one report suggested that by 2050 half of America's fuel needs could be provided by biofuels. Those notions might be hallucinations, and if oil prices fall sharply again biofuels may suffer for it. However, the days when biofuels weren't taken seriously are now clearly over.

[ED: A recent study insisted that biofuels did require more energy to produce than they yielded, but other studies were invoked in response to prove the reverse. If subsidies are phased out and biofuels become competitive with petroleum, the argument will disappear, since then if it cost more energy to make biofuels than they yielded, they would quickly stop being anything like a going proposition.

I think that sooner or later biofuels will be produced on an economical basis. One question I have is how much land would have to be set aside to provide enough fuel to replace oil; of course, this calculation is strongly dependent on variables such as the possibility of "multiple use" crops and on the efficiency of processing methods, with genetic engineering of both feedstock plants and the proper microbial production "tools" playing a factor in the efficiency calculation. However, it would still be worthwhile to know if the amount of land would be unrealistic.]

[ED: As of 2021, biofuels are stagnant. Falling costs of oil due to fracking meant they were economically impractical. Whether they will continue to be impractical remains to be seen.]


[TUE 20 SEP 05] WRISTWATCH OF TOMORROW

* WRISTWATCH OF TOMORROW: As discussed in an article in THE ECONOMIST ("Watch This Space", 17 September 2005), the idea of the high-tech watch has been around for a long time, going back at least to comic-strip detective Dick Tracy's two-way wrist TV. In practice, the high-tech watch has mostly been an illusion. Calculator wristwatches have been around for a long time, but they've hardly taken the world by storm, since they're just too inconvenient to use. The Swiss Swatch Group built a prototype "wrist phone" in 1998, and Samsung of South Korea unveiled a similar gadget in 2003, but nobody was interested; it was just so much more convenient to use a cellphone. Attempts to turn watches into digital music players didn't work much better, a pendant being a much less clumsy design. To be sure, gimmicky watches are available, such as one with a Geiger counter and a Japanese TV watch -- with a battery life of all of an hour. However, all they really are is gimmicks. Is there something useful that a watch can do other than tell the time?

Watchmakers think so. Last year, a number of manufacturers introduced new "smart" watches that could pick up messages from Microsoft's MSN Direct service, which broadcasts news headlines, sports flashes, and stock prices over an FM radio channel. MSN Direct is only available in certain regions of North America, but if it proves successful it is likely to spread elsewhere. Four watch manufacturers back the scheme: Suunto of Finland; Fossil of Texas; plus Tissot and Swatch, both members of the Swiss Swatch Group.

The Swatch Group is a key player, since its members account for 75% of global watch sales by revenue. At present, the smart watches are expensive, targeted at a professional audience. They are fairly hefty watches, and the yearly subscription fee required to access MSN Direct is an obstacle. However, even if this scheme doesn't pan out, there are others in the wings. An Austrian firm known as LAKS, for its boss, Lucas Alexander Karl Scheybal, has integrated a flash ROM chip into a watch, allowing it to be used for portable mass storage. A USB connector is fitted into its strap. The idea has proven successful, doubling LAKS sales.

LAKS is pushing the idea farther, integrating "smart card" electronics and a contactless interface link for secure transactions along with the flash ROM. Tickets or a cash account could be stored in the watch; it might also prove a nice place to store charge card numbers, medical information, and other useful data. Swatch is pursuing similar ideas, and in fact in July 2005, 64,000 Swatch "Access" watches were used as tickets for the opening events of the new Swiss national football stadium. Maybe, one of these days, the high-tech watch is finally going to catch on.


[MON 19 SEP 05] INTERNET-SCALE OPERATING SYSTEM (1)

* INTERNET-SCALE OPERATING SYSTEM (1): As discussed in an article from a while back in SCIENTIFIC AMERICAN ("The Worldwide Computer" by David P. Anderson and John Kubiatowicz, March 2002), although an internet user may curse the spam, the hype, the bugs, the trashy websites, and the delays, under the surface anyone who's stayed with the internet still retains a seed of excitement with the whole idea. The idea of being able to communicate and access information all over the world in a more-or-less immediate fashion is big-time stuff. Anybody who is too jaded to appreciate it might think of going back in time a quarter-century and wondering what life would be like without it.

What is particularly intriguing is that we've yet to really grasp all of what we can do with the internet. One unexpected idea is the notion that the internet really constitutes one great big distributed computer, with well over a hundred million nodes at present. This is such a megalomaniac idea that it is a little difficult to take seriously, but in fact a number of "internet distributed computing" applications have been invented, implemented, and put to more or less good use:

These four internet distributed computing applications use a similar approach. The user downloads screen saver software that implements the application and installs it. When the computer is otherwise idle, it performs calculations on a parcel of the problem, and reports the results back to a central server. This scheme is well suited to dealing with problems that require a lot of calculation that can be subdivided into separate and generally independent tasks that demand little communication. For example, "distributed.net" simply tells each computer to search through a designated range of keys. All each computer needs to do at the end of the search is report either a match, or a failure to match. The central server parcels out the tasks and consolidates the results.
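
A toy version of such a work unit, using a trivial XOR "cipher" as a stand-in for the real encryption being attacked, shows how little coordination each node needs:

    # Illustrative distributed.net-style work unit. The XOR "cipher" is a
    # stand-in for the real algorithm (RC5, DES); keys 0-255 are one parcel.
    def toy_decrypt(key: int, ciphertext: bytes) -> bytes:
        return bytes(b ^ key for b in ciphertext)

    def search_range(start, end, known_plaintext, ciphertext):
        # Scan one parcel of the keyspace; report a match or a failure.
        for key in range(start, end):
            if toy_decrypt(key, ciphertext) == known_plaintext:
                return key          # match: report it to the central server
        return None                 # failure: server hands out the next parcel

    # XOR is its own inverse, so "encrypt" with the same function:
    ciphertext = toy_decrypt(0x5A, b"ATTACK AT DAWN")
    print(search_range(0, 256, b"ATTACK AT DAWN", ciphertext))   # -> 90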

Another aspect of the internet considered as one big computer is to see it as a distributed file system. The now-infamous "Napster" site essentially took this point of view, providing a central database where one user could look up the location of a music file on another computer on the network, and then download the music from that computer. Napster ran foul of copyright laws playing this game, but the idea is far from dead. "Gnutella", for example, provides a distributed, private, shared file system among its users, as do "Freenet" and "Mojo Nation". Another angle on distributed internet computing is "Fasttrack", which parcels out internet search tasks among its nodes. [TO BE CONTINUED]


[FRI 16 SEP 05] LIQUID TREASURE (2)

* LIQUID TREASURE (2): At the present time, 90% of the world's fresh water is delivered by public water utilities of one sort or another. There are those who believe that private firms, unburdened by government bureaucracy, could do a better job.

Private firms have long been involved in water supply. Many public water utilities in the US and other Western countries began life as private firms. In fact, in France, despite a national tendency to sail to the left and look down on the American idolization of the private sector, water has been supplied by private firms for 150 years. The two biggest water supply companies in France are Generale des Eaux, with 25 million customers, and Lyonnaise des Eaux, with 14 million. Generale des Eaux was established under Napoleon III in 1853, and Lyonnaise des Eaux was established in 1880. Oddly, Lyonnaise des Eaux has never provided water to Lyons -- in fact, Lyons was Generale des Eaux's first supply district -- the name being derived only from the financial backers of Lyonnaise des Eaux, the Credit Lyonnais financial group.

Of course, these water supply operations are not completely private. The state will always have a major say in how the public gets water. France uses a scheme known as "affermage", in which the water system is actually owned by the municipality, but all the details of its operation are contracted out to the company.

The reach of the two companies extends far beyond France, since Generale des Eaux is merely the local arm of the international Veolia concern, and Lyonnaise des Eaux is similarly the local arm of Suez, the world's biggest water firm. The US has little clout in this field, with one major attempt to compete, the Azurix company, failing in the shadow of its parent, the sickly giant Enron energy company. In fact, the biggest private water firm in the US is, at the present time, US Filter -- which is now a branch of Veolia.

* Such long-standing companies are somewhat less interesting as examples than British water firms, which went from public to private in the recent past and provide examples on how privatization might work out elsewhere. In 1989, British Prime Minister Margaret Thatcher pushed through a plan to privatize all ten English and Welsh water utilities.

Backers of water privatization point out that these water firms get high grades for quality of service. The Scots kept their water system in public hands, and though rates were lower in Scotland for a while, service quality has fallen behind that found to the south, and rates are now higher than they are in England.

Financial performance has been an admitted difficulty for the privatized British water firms. When they were cut loose, the government gave them a boost by forgiving their debts and allowing them to hike up rates, which made them stock market darlings for a time. By the mid-1990s, however, the government -- in the form of the OfWat organization -- was tightening up the rules, and some of the utilities felt the pinch. The government discouraged mergers, leading some of the utilities to sell out to businesses in other sectors, such as electrical utilities, or to foreign firms. The biggest British water utility, Thames, which has an international presence that ranks third behind Suez and Veolia, sold out to RWE of Germany.

Of course, Germans don't actually own London's water mains. The government still owns the water, though unlike in France the British government doesn't own the distribution system, and exercises tight regulatory control over the private utilities, which are licensed for a term of 25 years. The government's influence on the water business is of course not always for the good. In 1997, Tony Blair's Labour government made water metering voluntary, with users able to choose between a metered or flat rate. Most preferred flat rate, a scheme that leads to inefficient use of water. [TO BE CONTINUED]


[THU 15 SEP 05] HYBRIDS ARRIVED

* HYBRIDS ARRIVED: As discussed in THE ECONOMIST ("Why The Future Is Hybrid", 4 December 2004), only a few years ago the hybrid automobile, with a gasoline engine driving an electric motor, was an experimental notion: something people talked about and tinkered with, but still a technology of the future. In 1997, Toyota took a leap into the unknown and released the first large-production gas-electric automobile, the Prius. The company did not realize how fast the future would arrive.

Since that time, only about a quarter million of these vehicles have been sold, not a large number by the standards of major automotive firms like Toyota, but the Prius is one of the hottest buzzes in the market. It's not particularly stylish, looking something like a melted-down wedge. However, it only uses half as much gas as a comparable conventional automobile, and so releases only half as much carbon dioxide, with even lower levels of other emissions. The Prius has become a "green" statement and there's currently a six-month wait to get one. Toyota is now raising production levels, though interestingly even at the new numbers, the levels of Prius production will only be a quarter those of the popular Toyota Camry.

Toyota Prius

Given a seller's market, however, the profit margins are pleasantly high, and other automobile manufacturers are trying to jump on the hybrid wagon. Ford is introducing the hybrid Escape and Honda has a hybrid Accord; DaimlerChrysler is working on a hybrid Mercedes; and GM, officially committed to the fuel cell car as the wave of the future, is hedging bets by introducing a range of hybrid vehicles. One of the interesting features of the GM hybrid pickups is that they leverage off the generator system to include power sockets for power tools or whatever, a handy feature for people working at remote sites.

* The idea of the hybrid isn't really new. There were tinkerings with the technology at the dawn of the automobile industry, and of course the basic principle is widely embodied in the form of the diesel-electric locomotive. Interest in hybrid cars began to revive with the energy crisis of the 1970s, though it would take time to pick up momentum. All the Big Three US car makers demonstrated prototype hybrids in the mid-1990s, but they were then sidetracked by the drive for all-electric cars, pushed into it despite their great reluctance by overly enthusiastic California bureaucrats who mandated an all-electric future for the state. California was trying to legislate in defiance of the laws of physics, or at least chemistry. Pure battery power couldn't deliver adequate range or reasonable cost, and the US electric car effort was a bust, pretty much as the auto manufacturers had expected.

In the early 1990s, Toyota was considering development of a more environmentally-friendly car. Japan is very dependent on imported energy and has serious air pollution problems, and in this case at least the government bureaucracy was more flexible than its California counterpart. Company engineers realized that a hybrid would be the best short-term solution, capable of doubling gas mileage, and that the technology was available to make it happen.

The Prius was introduced in Japan in 1997. Honda followed with its Insight in 1999. The Prius was introduced in the US in 2000. At first, neither the Prius nor the Insight made much of a splash, but in 2003 Toyota introduced a major redesign, just about the time fuel prices started to rise dramatically and endorsements from environmentally-conscious celebrities began to reach a critical mass. Now Toyota hybrids outsell Honda hybrids two-to-one in the US.

The Prius has had better press and is also more advanced than the Honda offerings. Not all hybrids are made equal, and in fact the term "hybrid" has suffered from some infection by marketing hype. At the bottom of the heap is the "micro" hybrid, which really isn't a hybrid at all, being better described by its other name, the "stop-start" car. It simply has a fast-starting engine that turns on instantly when the driver steps on the gas and turns off when the driver releases the gas pedal. It can save about 10% in gas mileage, being more beneficial in start-stop city traffic, and reduces air pollution from automobiles uselessly idling at the stop light.

The next level is the "mild" hybrid, which fits the Honda designs. It features start-stop operation and an electric motor-generator linked to the gasoline engine, with the motor-generator boosting engine acceleration and obtaining power from "regenerative braking". It is actually a fairly effective scheme, requiring a fairly modest amount of additional hardware and increasing gas mileage by over a third.

The Prius is a "full" hybrid, and substantially more sophisticated. It features a gasoline engine, an electric motor and generator system, a large battery pack, and a sophisticated power-control system. At low speeds and start-stop city traffic, the Prius operates as an electric car, running only on battery power. At higher speeds, the gasoline engine kicks in, driving both the wheels and the electric generator to recharge the battery. For quick acceleration, the electric motor helps the gasoline motor bring the car up to speed. The generator will perform regenerative braking to help recharge the batteries. The fact that the Prius operates as an electric car in slow traffic is what gives it its superlative gas mileage.
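
The Prius-style mode switching can be caricatured in a few lines of logic. The thresholds below are invented for illustration -- a real hybrid power-control system is far subtler -- but the decision structure is the one just described:

    # Caricature of "full" hybrid drive-mode selection; thresholds invented.
    def drive_mode(speed_kph: float, throttle: float, battery_frac: float) -> str:
        if throttle > 0.8:
            return "engine + motor assist"      # quick acceleration: both at once
        if speed_kph < 40 and battery_frac > 0.3:
            return "electric only"              # slow traffic: battery power alone
        return "engine + generator"             # cruise: engine drives the wheels
                                                # and recharges the battery pack

    # Braking is handled separately: the motor runs as a generator
    # ("regenerative braking"), feeding charge back into the battery pack.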

The next step is the "plug-in" hybrid, which at first glance seems like a return to the older, discredited electric cars. Actually, a plug-in hybrid isn't all that different from the Prius system, the main changes being that it has a bigger battery pack and can be recharged from a power socket in a garage. For short trips around town, it wouldn't use any gasoline at all, only calling in the gasoline engine for long-range trips.

Hybrids aren't very popular in Europe, where diesel technology is all the rage. Diesel fuel contains more energy than gasoline, and modern diesel engines aren't the smoky and noisy contraptions that their ancestors were. This leads to the question: why not a diesel-electric hybrid? In fact, European car manufacturers are working on such a thing. One challenge is that even improved diesels are still not as clean as gasoline engines, though efforts are being made to design filter systems to scavenge up the pollutants that still get through. The other challenge is that a diesel engine imposes a cost premium on the sticker price of a car; add to this the premium imposed by a hybrid system, and sticker shock sets in.

The fuel cell is seen as a possible ultimate solution, allowing the internal combustion engine to be thrown out entirely. A hydrogen-powered fuel-cell electric car would be almost entirely nonpolluting, but nobody expects it to happen any time soon. Fuel cells still need work, and the idea of powering them with straight hydrogen sounds much tidier on paper than it is in practice. Right now, the hybrid automobile is the real game in town, providing energy efficiency without dictating any massive change in infrastructure. When the next generation of practical automotive technology comes down the road, it will almost certainly leverage off the steps taken by the current generation of hybrids.

BACK_TO_TOP

[WED 14 SEP 05] THE OVERLAND TRAIN

* THE OVERLAND TRAIN: As discussed some years back in an article in the magazine INVENTION & TECHNOLOGY ("Big Wheels" by Charles W. Eberling, Winter 2001), America in the two decades after the end of World War II was a different place than it is now. The US ended the conflict stronger than ever while all rival nations had almost bled to death, and many Americans could feel without any self-consciousness that their nation was a "colossus astride the globe". They thought in big terms and took on projects of unbelievable scale. Some, like the Apollo manned Moon program, they astoundingly pulled off. Not too surprisingly, many others didn't work out so well.

As a case in point, consider the US Army's "Overland Train Mark II", almost without doubt the biggest off-road vehicle ever built. It consisted of an oversized three-axle / six-wheel control car, linked to twelve four-wheel cars in a train. It was twice as wide as an ordinary highway freight-hauler rig, had a length of 175 meters (572 feet), and ran on 54 tires, each 3 meters tall and 1.2 meters wide (10 x 4 feet). The control car did not pull the cars: they were all self-powered, with each wheel driven by its own electric motor. The maximum speed of the entire rig was 32 KPH (20 MPH). It looked like something Luke Skywalker might have encountered in the deserts of the planet Tatooine.

The Overland Train was built by the Robert G. LeTourneau Company of Longview, Texas. LeTourneau began his career as an earth moving contractor, working on large civil engineering projects. In 1924, he went into the business of building his own earth moving machinery. His business turned into an international success. At first, his earth moving machines used steel wheels on the job, with interchangeable rubber wheels for highway transit. More or less by accident, LeTourneau discovered that the rubber wheels were actually more effective all around. He then started to think about fitting electric motors in each wheel.

World War II came along, and the LeTourneau company went into full gear building heavy earth moving equipment for the military. It's estimated that two-thirds of the earth movers built by the US during the war were manufactured by LeTourneau; it is worthwhile to remember in this context that General Dwight Eisenhower said the bulldozer was one of the four primary tools for winning the war, along with the landing craft, the 6x6 truck, and the Douglas DC-3 transport aircraft. The idea of developing earth moving machines with self-powered wheels had to be put aside for the time being.

In 1953, LeTourneau sold off most of his operation to Westinghouse Air Brake. He retained two factories, but the sale agreement restricted him from building earth moving machinery in competition with Westinghouse for five years, and so he turned to development of such items as log stackers and missile transporters. When he heard that the US Army was interested in obtaining a large ground transport to resupply the radar stations of the Distant Early Warning Network in the far north, he got in touch with the Army and discussed ideas.

The US Army Transportation Command came up with specifications, and in 1955 the LeTourneau plant in Longview, Texas, produced the "LCC-1" prototype to meet them. It consisted of a four-wheel control vehicle linked to three four-wheel cars, with a total length of 53 meters (174 feet). All sixteen of its 3 x 1.2 meter (10 x 4 foot) wheels were self-powered. The control vehicle had two articulated compartments: a heated driving / living compartment in front that could accommodate three crew, and a rear engineering compartment with a 450 kW (600 HP) diesel engine, electric generators, a fuel tank, plus a crane to help deal with wheel changes and linking or unlinking cars.

The LCC-1 was evaluated on sand dunes and snowdrifts, and shipped to Greenland for severe Arctic testing as the "Sno Train". All went well, paving the way for development of the Overland Train Mark II. Its general configuration was along the lines of the LCC-1, but along with an improved control car, with accommodations for six crew, it had ten cargo cars and two power generator cars fitted with gas turbine engines. It could be handled by a single driver, with each car in the train turning at the spot where the control vehicle had turned.

The Overland Train was evaluated intensively in the early 1960s. It did well in the evaluations, but then the Army got cold feet. Money was short, the Overland Train promised to be expensive, and the Army didn't have a cadre of mechanics who could maintain the thing properly. The program was abandoned. Most of the hardware was junked, though the control cab survives. LeTourneau still builds giant machines with individually driven wheels, but nothing exactly on the scale of the Overland Train.

BACK_TO_TOP

[TUE 13 SEP 05] SPACEFIGHTERS

* SPACEFIGHTERS: As discussed in an article in AVIATION WEEK ("Space Warriors" by William B. Scott, 7 March 2005), the idea of a "Space Combat Force" might sound like something out of a bad cartoon show, but in a sense such a thing does exist, in the form of the US Air Force (USAF) "21st Space Wing (21CW)". Unlike most Air Force combat wings, the 21CW is distributed over the globe in 26 installations, the largest of them being Peterson Air Force Base (AFB) in Colorado, the wing's headquarters; Cheyenne Mountain Air Force Station (AFS), also in Colorado; Thule AFB in Greenland; and Clear AFS in Alaska.

The primary mission of 21CW is space warning, using the Defense Support Program (DSP) missile early warning satellites and a network of ground-based radars. The DSP spacecraft pick up the infrared signature of missile launches and report the launches to the US / Canadian North American Aerospace Defense Command (NORAD) and Strategic Command centers inside the buried Cheyenne Mountain complex.

The primary radar systems include PAVE PAWS (Phased Array Warning System), BMEWS (Ballistic Missile Early Warning System), and PARCS (Perimeter Acquisition Radar Attack Characterization System). The radar data is consolidated at NORAD along with the DSP information to provide a cross-check, and to verify trajectory information. In October 2004, the US Navy's Space Surveillance System was transferred to the 21CW. The "Fence", as it is known, was put into operation in 1959; it consists of three transmitter sites and six separate receiver sites spread across the southern flank of the continental US. The Fence acts as a screen and can spot objects to high altitude; work is underway to integrate it into the system.

BMEWS radar at Thule

The three BMEWS radar systems, which are at Thule, Clear, and Royal Air Force Fylingdales air base in the UK, also support the 21CW's "space surveillance" effort, which keeps track of more than 8,500 space objects in a "space catalog". The catalog emphasizes objects with a cross-section of a meter or more, but it includes items of special interest down to sizes of a centimeter. The 21CW notifies the US National Aeronautics & Space Administration (NASA) of threats posed by objects to NASA flights. NASA has modified shuttle flights a dozen times and moved the International Space Station six times to avoid collisions. The 21CW also obtains images of orbital objects using the "Ground Based Electro-Optical Deep Space Surveillance System (GEODSS)". GEODSS isn't really adequate to track the new microsatellites now being launched, so it is being upgraded with a "Deep Stare" capability to improve resolution.

The 21CW not only maintains space "situational awareness" for the US, it can also take defensive actions to help protect US space assets against potential threats. Options for offensive actions are also available, but they are kept quiet because they are controversial. Commanders of the 21CW believe in the importance of the job they are performing and see it expanding in the future. This makes for an exciting challenge, but the brass is frustrated by low budgets and aging equipment that make it harder to get the job done right. Eventually, they hope to create a space data system that can provide a realtime "map" of all activities on a display, listing and categorizing all spacecraft and incorporating data on "space weather".

BACK_TO_TOP

[MON 12 SEP 05] THE NORDEN & SPERRY BOMBSIGHTS (2)

* THE NORDEN & SPERRY BOMBSIGHTS (2): The Norden bombsight had potential competition. The Sperry Gyroscope Company had been founded in 1909 by Elmer Sperry, a genius who essentially founded the modern science of feedback control systems. Sperry started out developing stabilization systems that would allow warships to fire accurately in heavy seas, and quickly went on to control systems for aircraft. The Sperry company was granted a patent for a gyrostabilized bombsight as far back as 1914, and in the following decades produced a series of improved bombsights that led to the "Sperry O-1" of 1933. However, the Sperry bombsight was by no means obviously superior to the Norden design, and the US military was already committed to Norden.

The USAAC did have some problems with the Norden bombsight. One was purely bureaucratic. Since Norden could only make bombsights for the US Navy, the USAAC had to order bombsights through the Navy, and of course this made life complicated for all concerned. The Norden bombsight was also designed for Navy flying boats, such as the Consolidated PBY Catalina, which flew at relatively low speeds and medium altitudes. USAAC bombardiers had to "fudge" the settings of the bombsight to get good results. The troubles came to a head in January 1936, after the Norden Company ran into production problems and couldn't deliver bombsights on schedule. The Navy then refused to supply bombsights to the USAAC until the Navy's own needs were met. The commander of General Headquarters (GHQ) Air Force, Major General Frank M. Andrews, initiated contacts with the Sperry Company.

By 1937, Sperry had developed new technology that promised substantial improvements in their bombsight design. A star electrical engineer named Orland E. Esval came up with a new gyroscope that had greater mass than the gyroscope used in the Sperry O-1, and spun at a rate of 30,000 revolutions per minute (RPM). The new gyroscope was far more stable than Norden or earlier Sperry designs. Another Sperry engineer, a newcomer named Carl Frische who eventually became company president, worked with Esval to devise a "self-erecting" system for the new gyroscope. The self-erecting system eliminated the need to fiddle with bubble levels. As long as the aircraft was flying straight and level, the gyroscope would find the vertical when the self-erecting system was turned on. It was turned off during flight maneuvers that might affect the level setting. The new gyros were self-lubricating and induction-powered, meaning they did away with troublesome carbon brushes. Esval and Frische also developed a scheme by which movement of the azimuth gyro could be sensed electromagnetically, and fed into an electronic feedback system to keep the bombsight stable.

These innovations led to a new and much improved bombsight, the "Sperry S-1". However, the S-1 required an AC power source, and until that time aircraft instruments had only used DC power. This led to the development of AC power systems for aircraft, operating at 400 Hz. The rate of spin of the gyroscopes was synched to the AC rate, meaning their RPM dropped to 24,000 in practice, but the lower RPM caused no real problems.
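Incidentally, the 24,000 RPM figure falls straight out of the arithmetic for AC motors, if one assumes a two-pole machine running at synchronous speed -- the pole count being my assumption, not something stated in the source:

    # Synchronous speed of an AC motor: RPM = 120 * frequency / poles.
    # A two-pole gyro motor is assumed here for illustration.
    freq_hz = 400
    poles = 2
    rpm = 120 * freq_hz / poles
    print(rpm)    # -> 24000.0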

* The development of an improved bombsight also led to the development of an improved autopilot system. Sperry had been working on autopilots for decades, and in fact some early models of the Boeing B-17 Flying Fortress were fitted with the Sperry A-3 commercial autopilot. The A-3 was based on pneumatic-hydraulic servo systems and had sluggish response. It tended to overcompensate in rough air, causing the aircraft to oscillate. The Norden Company had a comparable electromechanical autopilot, known as "Stabilized Bombing Approach Equipment (SBAE)".

A bomber needed a good autopilot to ensure straight and level flight up to bomb release. To this end, Frische developed the first all-electronic autopilot, the "Sperry A-5". The A-5 featured three vacuum-tube amplifiers, one each for the aircraft's pitch, roll, and yaw axes, that amplified displacement signals sensed from the autopilot's gyroscopes. The electronics also obtained velocity and acceleration from the three displacement signals. The amplifier outputs controlled electrically actuated hydraulic servomechanisms. System response was very fast, keeping the aircraft very stable. The S-1 bombsight was electronically linked to the A-5 autopilot. When the bomber approached the target, the pilot essentially turned control of the aircraft over to the bombsight. The bombardier kept the bombsight crosshairs fixed on the target, and the bombsight not only kept the aircraft on course, but also released the bombs at the proper time.
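Deriving velocity and acceleration from a displacement signal and feeding all three back into the servos is essentially what a modern engineer would call proportional-derivative control. Below is a minimal discrete-time sketch of that idea for a single axis; the gains and sample interval are invented illustration values, not Sperry's:

    # One control axis in the spirit of the A-5 autopilot: the gyro's
    # displacement signal plus its derived rate and acceleration are
    # combined into a servo command. All constants are invented.

    class AxisController:
        def __init__(self, kp=2.0, kv=0.8, ka=0.05, dt=0.02):
            self.kp, self.kv, self.ka, self.dt = kp, kv, ka, dt
            self.prev_disp = 0.0
            self.prev_rate = 0.0

        def update(self, displacement):
            rate = (displacement - self.prev_disp) / self.dt     # velocity
            accel = (rate - self.prev_rate) / self.dt            # acceleration
            self.prev_disp, self.prev_rate = displacement, rate
            # the command opposes the displacement and its derivatives
            return -(self.kp * displacement + self.kv * rate + self.ka * accel)

    pitch = AxisController()                   # one of three such axes
    for disp in (0.0, 0.5, 0.9, 0.7, 0.3):     # gyro displacement samples
        print(round(pitch.update(disp), 1))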

Bombing accuracy was extremely impressive, and in June 1941, the USAAC awarded Sperry a major contract for the S-1 bombsight and the A-5 autopilot. The Air Corps liked the A-5 autopilot so much they wanted to standardize on it, and talked to the Norden Company about working with Sperry to ensure that the Norden bombsight could work with the A-5.

* The Norden Company was uncooperative. Norden, to no surprise, was very unhappy with competition from Sperry and lobbied against the Sperry Company. Theodore Barth proved very useful in this effort, since he had many contacts in the Army and Navy and was a super salesman. Barth and other Norden advocates had successfully managed to cast an air of mystery around the Norden bombsight, making it sound like a Buck Rogers marvel weapon. Although by the time the US entered the war the Norden bombsight's security classification had been reduced from "top secret" to "confidential", Air Corps security procedures concerning the bombsight approached the silly. Bombsights were kept under lock and key between missions, and the "football", as it was called, was escorted between the lockup and the bomber under armed guard. Bombardiers had to take a formal oath to protect it with their lives.

In fact, even radio serials played up the Norden bombsight. JACK ARMSTRONG, THE ALL AMERICAN BOY, promoted the "Secret Norden Bombsight" toy, which was a little wooden box with a mirror arrangement that allowed the user to pound cardboard-cutout U-boats with little red bombs. In contrast, the Sperry system was kept under very tight security, with no publicity, and in fact Sperry Company kept their production of bombsights a complete secret. Sperry also suffered from the fact that they were an international company. Norden only worked for the US Navy, but Sperry had granted technology licenses to Germany and Japan before the war. Some of the government brass considered this almost treasonable, and Norden was quick to hammer on this weakness.

Sperry managed to hold their own through their own sales manager, Fred Vose, and the good graces of Major General Andrews. Unfortunately, Vose was killed in a stateside plane crash in April 1942, and Andrews was killed in a crash in Iceland in May 1943. Norden had more momentum and a strong sales pitch, and in August 1943 USAAF brass recommended that the Air Force standardize on the Norden bombsight. On 22 November, the Air Force canceled their contracts with Sperry, and Sperry shut down S-1 and A-5 production. The USAAF flew with the Norden bombsight and C-1 autopilot to the end of the war.

* Postwar analysis showed that the effectiveness during the war of high altitude precision bombing had been exaggerated. The Norden bombsight could do the job when the weather was clear, but Northern Europe tends to be cloudy. Furthermore, the Germans proved resilient in adjusting to air attacks and bomb damage. The Sperry bombsight wouldn't have changed the results much. However, the A-5 autopilot was a major technical advance that would lead to improved autopilot systems after the war. [END OF SERIES]

PREV
BACK_TO_TOP

[FRI 09 SEP 05] LIQUID TREASURE (1)

* LIQUID TREASURE (1): As discussed in a survey in THE ECONOMIST ("Priceless: A Survey Of Water" by John Peet, 19 July 2003), water simultaneously seems like a common substance and a precious one. There's plenty of water on the Earth, with two-thirds of the planet's surface covered by it to a considerable average depth. Of course, the overwhelming portion of this water -- 97% -- is salt water, which can't be directly used for drinking or agriculture. Of the 3% of water on the planet that is fresh, two-thirds of it is locked up in ice and glaciers.

That 1% of the Earth's water that's useful to humans is still a lot of water, and it would seem there's plenty for everyone. The big problem is that water isn't distributed in anything like an even fashion. Canada has more fresh water than the country can use, while Australia is baked and dry. The distribution of water also has a time component. In places like Bangladesh, the rains fall heavily for part of the year, sometimes causing disastrous floods, and hardly fall at all for the rest of the year.

Of course, such variations in the distribution of water can be addressed with improved hydrotechnology, though as with all technological solutions the simple answer sometimes turns out to be more difficult than expected. More significantly, the technical solution almost inevitably involves politics and economics, which of course have the effect of making a difficult matter even more difficult. Politicians understand that people feel they have a right to water, and so it is generally priced much lower than a reasonable accounting of the cost of supplying it would justify, with the difference made up with public funds. To an extent it is hard to argue against such a policy, since it is clearly unjust to force the poorest citizens to bear the full cost of an absolute necessity of life.

At the same time, however, it is not necessarily the poorest citizens who are the greatest beneficiaries of subsidized water services, nor do such services serve them very well. Government subsidized water is most heavily used by industry and, particularly, agriculture. In developed countries, irrigation accounts for about half of all water use. In undeveloped countries, it accounts for over three-quarters.

Even that doesn't sound like a monstrous evil at first glance. If farmers need that much water to grow crops, then there's no sense in not letting them have it. However, artificial pricing policies for water distort markets and encourage waste: they lead farmers, for example, to plant crops that would be totally unsuited to the land they're planted on if water weren't so cheap, and that are often uncompetitive with the same crops grown in countries where the land is well-suited to them.

Activists may complain about this wastefulness, but their pressures on government have helped create the situation. As mentioned, the public is hostile to the idea that everyone should bear the actual costs of water. Given this mindset, it then becomes a bureaucratic and legal nightmare to pick and choose among those who have a just right to low-cost water and those who don't. In such an environment, the wealthier lobbyists have the edge.

Another complication introduced by activism is that profiting from water is seen as evil, though as one major water-company executive put it: "God provided the water, but not the pipes." The reality is that the biggest problem is not really overpriced water, but no water, or foul water. At least a billion people on this planet have unacceptable access to water, and such water as they do have is often unhealthy. Private industry could help solve that problem if a profit were available at the end of the rainbow, but a public prejudice against profits works against that goal.

In sum, then, water supply is a complicated matter, involving not merely technical issues, but an extremely tangled tug-of-war between private and public interests. However, those involved say it is a situation with a solution. The Johannesburg Earth Summit of August 2002 set a goal of providing clean water to half the world's people who currently lack it by 2015. Water officials believe this is a perfectly realistic goal. [TO BE CONTINUED]

NEXT
BACK_TO_TOP

[THU 08 SEP 05] MEASUREMENT AT THE EXTREMES

* MEASUREMENT AT THE EXTREMES: As discussed in articles from SCIENCEMAG.org ("Measurement & The Single Particle" by Andrew Watson, and following, 19 November 2004), engineers have an old saying that "precision begets precision", meaning that improved technologies become available as a result of the development of more precise tools to build them, and the reverse. The end result has been to push precision up towards its limits.

Precision implies measurements, and measuring at the limits is challenging. For example, right now the unit of mass known as the kilogram is defined by a specific cylinder of platinum-iridium alloy, kept locked away near Paris. Obviously dependence of a unit of measure on an arbitrary object is unsatisfactory, and it would be less arbitrary to base it on, say, a specific number of gold or other atoms, so that anyone could duplicate the standard anywhere. That means figuring out ways to count individual atoms, and work is in progress towards that goal.
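Just to get a feel for the scale of the counting problem, consider a back-of-the-envelope figure for how many atoms a kilogram standard would involve, using gold as the example:

    # Rough count of atoms in one kilogram of gold:
    # N = (mass / molar mass) * Avogadro's number
    avogadro = 6.022e23       # atoms per mole
    molar_mass = 196.97       # grams per mole for gold
    atoms = (1000.0 / molar_mass) * avogadro
    print(f"{atoms:.2e}")     # -> about 3.06e+24 atoms

That is roughly three million billion billion atoms to be accounted for, which gives a sense of why the scheme remains a work in progress.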

Another precision measurement effort is being conducted by the US National Institute of Standards & Technology (NIST). NIST researchers have developed an "electron pump" that can shove electrons one at a time into a capacitor. The capacitor's voltage can then be measured and the electrons pumped out again, one at a time. The NIST electron pump isn't a useful applications technology, but it's a fine tool for fundamental measurements of current and capacitance.

The core of the NIST electron pump is the "single electron tunneling (SET)" device, which at least conceptually looks very much like a field effect transistor (FET), with a current source connection, current drain connection, and gate control electrode. However, the conventional channel of a FET, placed between source and drain and under the gate, is replaced by an "island" of conductive material, linked by "tunneling junctions" to the source and the drain. A tunneling junction is basically an insulator wall through which electrons shouldn't flow, but the junctions are very thin and, due to the probabilistic laws of quantum physics, there is a small probability that an electron will magically pop through one. Carefully manipulating the gate voltage will gate through electrons one at a time.

This is actually an oversimplified description. The probabilistic laws of quantum physics mean that it is impossible to always ensure one electron will pop through. There is a relatively small probability that two will pop through and a similar probability that none will. To reduce such errors to a low level, multiple islands are connected in series, each with its own gate electrode and separated by tunneling junctions. The NIST electron pump has seven junctions, reducing errors to one in a hundred million electrons.
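A crude way to see why stacking junctions helps: if each junction independently lets a spurious tunneling event through with some probability, then a counting error has to make it through every junction in series. This is strictly a toy model of my own -- the real error statistics of the pump are more subtle -- but it shows the flavor of the suppression:

    # Toy model: an error must pass all N tunneling junctions in series.
    # The per-junction error probability below is invented for illustration.
    per_junction_error = 0.07
    for n in (1, 3, 5, 7):
        print(n, "junctions:", f"{per_junction_error ** n:.1e}")
    # with 7 junctions the compound error comes out near 1e-8,
    # the "one in a hundred million" order quoted above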

Similar SET technologies are being used to measure instead of control electron flow, and Finnish researchers are using SET technology for extremely precise temperature measurements, since its conductance changes precisely with temperature. It only works below 30 kelvins, but there is currently a need for a precision low-temperature measurement scheme, and SET technology might do the job.

* Another target for precision measurement is the "phonon", the quantized lattice vibration of a crystalline solid that transports heat through the solid. Not too surprisingly, the number of phonons increases with temperature: at room temperature, a grain of salt contains about 10^18 phonons. A group at Britain's National Physical Laboratory (NPL) has come up with a scheme for measuring single phonons, bridging the gap between the ultrafine needle tip of an atomic force microscope and a surface with a carbon nanotube about a nanometer in diameter. When the probe is placed on a surface, phonons travel up and down the nanotube one at a time, resulting in a quantized heat transfer. NPL researchers think their scheme might be used as an ultra-accurate thermometer.

Photons are yet another target for single-event measurements, but trying to sense single photons over a wide range of wavelengths (and, equivalently, energies) is troublesome. A collaborative group of researchers at the University of Oxford, Harvard, and the University of Naples has developed a broadband detector, capable of picking up photons all the way from the infrared to the soft X-ray regions of the spectrum, using a superconducting layer. Superconductivity occurs when electrons are linked by phonon exchanges into "Cooper pairs", with the properties of the pair being radically different from the properties of the individual electrons. An incoming photon breaks up a Cooper pair, with the two electrons tunneling into a second layer, where they create a cascade of current that can be measured.

NIST researchers have built a detector for infrared photons based on a thin film of superconducting tungsten, but it works on different principles. When hit by a photon, the tungsten heats up and ceases to superconduct, then cools right off and becomes superconducting again. The change in resistance can be used to detect the photon, and the device is fast enough to measure 20,000 photons every second.

Another NPL group is working on a photon detector based on "superconducting quantum interference device (SQUID)" technology. A SQUID is two layers of superconductor separated by a tunneling junction, and its resistance is very sensitive to magnetic fields. The NPL group embedded a photon-absorbing material inside a SQUID; when a photon hits the absorber, it warms it a bit and changes its magnetic properties, with the change measured by the SQUID. The NPL researchers believe that they will be able to sense photons over a very wide band, if they can find the right absorber.

* Tunneling and superconducting technologies are not the only approach being pursued for extreme measurements. Another approach is to use "nanoelectromechanical systems (NEMS)" to do the job. NEMS are small, conceptually simple machines carved out of silicon by the same processes used to build integrated circuits. These processes have been refined to provide features down into the nanometers, opening up some interesting possibilities for ultraprecise measurements.

Researchers have focused on the possibilities of tiny silicon cantilevers, little "diving boards" on the order of 100 micrometers long and 100 nanometers thick. At such dimensions, a cantilever beam will vibrate at very high frequencies, up to a gigahertz. One scheme in the works uses such a vibrating beam to transfer electrons from one plate to another. Another measures the mass of a particle of some sort attached to its tip through the fact that the resonant frequency of the beam will change when it is loaded down; the technology can measure masses down to 10^-18 grams and could be used, for instance, to measure the mass of a virus. A third scheme involves fixing a tiny bead of magnetic material to the tip of the beam and using it to perform ultraprecise measurements of magnetic fields.
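The mass-measurement trick rests on treating the cantilever as a simple harmonic oscillator, with resonant frequency f = (1/2pi) * sqrt(k/m); a small added mass lowers the frequency, and the size of the shift gives back the mass. Here is a sketch under those idealized assumptions, with invented numbers:

    import math

    # Ideal harmonic-oscillator model of a cantilever: f = (1/2pi)*sqrt(k/m).
    # For a small added mass, delta_m ~= -2 * m * (delta_f / f).
    # Stiffness and beam mass below are invented illustration values.
    k = 1.0          # effective spring constant, N/m
    m = 1.0e-15      # effective beam mass, kg

    f0 = math.sqrt(k / m) / (2 * math.pi)              # unloaded resonance
    m_virus = 1.0e-21                                  # 10^-18 grams, in kg
    f1 = math.sqrt(k / (m + m_virus)) / (2 * math.pi)  # loaded resonance
    recovered = -2 * m * (f1 - f0) / f0                # mass from the shift

    print(f"resonance {f0:.3e} Hz, shift {f0 - f1:.3e} Hz")
    print(f"recovered mass {recovered:.1e} kg")        # -> ~1.0e-21 kg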

Researchers using NEMS are up against the physical constraints of the nanoscale. The smaller things get, the harder it is to precisely measure or define dimensions, "like trying to measure distance on a foam mattress", as one worker describes it. The resonant frequency of a vibrating beam will change drastically with variations in length at such scales, and one of the objectives of this research is to come up with schemes of calibration and self-adjustment.

* Precision measurement of time is another frontier for extreme measurement. All clocks are based on some sort of oscillator and a means for measuring the cycles of that oscillator. The classic example is the pendulum clock or hairspring watch, with a mechanical oscillator that was regulated and measured by a gearing system. Such clocks are influenced by changes in temperature, humidity, and local gravity; although the best of them achieved remarkable accuracies, they would be hard-pressed to keep time to better than a few seconds of drift a month.
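The temperature sensitivity is easy to put numbers on. A pendulum's period goes as the square root of its length, so thermal expansion of the rod shifts the rate by half the fractional length change. The sketch below assumes a plain, uncompensated steel rod -- which is exactly why precision regulators went to elaborate temperature-compensated pendulums:

    # Pendulum period: T = 2*pi*sqrt(L/g), so a fractional length change
    # of alpha*dT slows the clock by alpha*dT/2. Plain steel rod assumed.
    alpha_steel = 12e-6                     # thermal expansion per kelvin
    dT = 1.0                                # one degree of warming
    rate_error = alpha_steel * dT / 2
    month = 30 * 24 * 3600                  # seconds in a month
    print(round(rate_error * month, 1))     # -> about 15.6 seconds/month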

Modern electronic technologies introduced the electronic oscillator based on the quartz crystal resonator, and this technology has largely replaced the mechanical clock. It can achieve high accuracies, particularly if the quartz crystal is kept at a constant temperature in a vacuum, but it's still not a technology that approaches the accuracy needed for extreme measurements.

In 1945, the American physicist Isidor Isaac Rabi suggested a clock based on quantum-physical principles that could provide unprecedented accuracy. He knew that cesium atoms had an energy resonance at exactly 9.192631770 GHz, and that they would fluoresce when excited by microwave energy at this resonance. In Rabi's "atomic clock", a microwave source is adjusted until it causes a sample of gaseous cesium atoms to fluoresce, and then the microwave cycles are counted as the "ticks" of the clock. When the first atomic clock went online in 1949, it was accurate to about one part in 10^10. NIST currently operates a cesium-based atomic clock that is accurate to about one part in 10^15. If such a clock had been put into operation when the dinosaurs were wiped out 65 million years ago, it would have drifted by only a second or two by now.
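The dinosaur comparison is easy to check -- take the accuracy as a fractional frequency error of one part in 10^15 and multiply by the elapsed time:

    # Drift of a clock with fractional accuracy 1e-15 over 65 million years:
    years = 65e6
    elapsed = years * 365.25 * 24 * 3600    # elapsed time in seconds
    drift = 1e-15 * elapsed
    print(f"{elapsed:.2e} s elapsed, drift {drift:.1f} s")   # -> about 2 s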

Extreme measurement researchers think they can do better, much better, but of course it's a challenge. The central idea is to use lasers, which have much shorter wavelengths than microwave sources and so much higher "tick" resolution, to stimulate atoms held in laser traps that "cool" atoms into immobility. Cesium doesn't work all that well in this scenario, however, since the laser trap tends to interfere with the "clock" laser beam. Other atoms or combinations of atoms are being considered instead; one approach uses a beryllium atom and an aluminum atom arranged in a "quantum entanglement" where they share a common quantum state. Researchers think they can get down to accuracies of one part in 10^18 or better, but they admit they have their work cut out for them.

* Lasers are very useful for extreme measurements, able to stimulate atoms at precise wavelengths and energies with very short bursts of light, able to act like a camera with an ultrafast shutter speed to catch internal processes in atoms. Researchers have been trying to push the pulse lengths down as far as possible, and are now shooting for pulses with lengths on the order of 10^-18 seconds.

An optical-wavelength laser can't be used to do this, because a single optical cycle lasts longer than that. A short-wavelength extreme ultraviolet (XUV) source is required instead. A number of research groups have taken an indirect approach, using an infrared laser to stimulate atoms of a rare gas. The excited atoms emit light at high multiples or "harmonics" of the laser frequency, which end up in the XUV. The problem was that the atoms would emit a series of XUV pulses; to be useful for measurements, only one pulse can be emitted.
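The cycle-length argument is just arithmetic -- the period of one cycle is the wavelength divided by the speed of light, and for visible or infrared light that comes to femtoseconds, thousands of times longer than an attosecond:

    # Period of one optical cycle: T = wavelength / c.
    c = 3.0e8                        # speed of light, m/s
    for nm in (800, 400, 30):        # infrared, violet, XUV wavelengths
        print(nm, "nm ->", f"{nm * 1e-9 / c:.1e}", "s per cycle")
    # 800 nm gives ~2.7e-15 s per cycle; only at XUV wavelengths
    # like 30 nm (~1e-16 s) does a cycle approach the attosecond regime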

In 1999, a Swiss researcher proposed a feedback technique in which the laser wavelength was adjusted in response to atom behavior. It wasn't trivial to do, but in 2003 a German-Austrian group managed to get it to work. The problem then was to prove that it was working: it was hard to generate such short pulses, and it was tricky to measure them. The group ended up developing a toolkit of indirect techniques, and described their toolkit in the science press. A number of other research groups are now using it for their own research efforts.

* If all the researchers working on extreme measurements reach their goals, what then? The answer is obvious: go to the next level and improve the accuracy of measurements by a factor of 10, 100, 1,000, or whatever. Physical limits of course set in -- counting less than one electron might be very tricky, but given the bizarre way quantum mechanics works it doesn't seem exactly out of the question -- but until they do, somebody will keep right on pushing up to them.

BACK_TO_TOP

[WED 07 SEP 05] THE CLOCKWORK ANDROIDS

* THE CLOCKWORK ANDROIDS: As discussed in an article from SCIENCE magazine some years back ("The Clockmaker's Androids" by James Hansen, July-August 1982), the idea of the mechanical man has been around a long time. As prominent evidence of just how old the notion is, the Museum of Art and History in Neuchatel, Switzerland, has three marvelous mechanical men -- or rather, two cherubic mechanical boys who appear to be about three years old, and a much more lifelike teenaged girl of about 16 or so. They write, draw, and play the organ -- impressive feats, considering they run on clockwork and are over 200 years old.

Neuchatel has a long tradition of watchmaking, and the three clockwork androids were built between 1768 and 1774 by one of the city's finest watchmakers, Pierre Jacquet-Droz, and his two sons. Jacquet-Droz had been born in 1721 into a well-to-do farm family, and eventually was sent to the University of Basel for studies in the Protestant ministry. In Basel, he found himself presented with many distractions. Clockwork marvels were popular in those days, and often presented in traveling shows; the most famous of them was a mechanical duck built by a Frenchman, Jacques de Vaucanson, that could swim, smooth its feathers, eat kernels of corn, and excrete what looked like bird droppings.

What happened to Jacquet-Droz in Basel is not really known, but when he came back home, he turned to watchmaking instead of the ministry; over two decades his clockworks became famous all over Europe, with his clientele including much of the European aristocracy. Some of his clocks had mechanical contrivances like singing birds, but Jacquet-Droz wasn't satisfied with such modest achievements: in 1768 he began work on the first of his clockwork androids, one of the little boys, who would be named "Charles Jacquet-Droz, the Scribe". It took Jacquet-Droz and his sons until 1772 to complete Charles.

Charles sits at a desk; he's made of wood and brass, and is brought to life by winding him up. He then carefully writes out messages of up to 40 characters (including spaces; the messages may have multiple lines), occasionally dipping his quill pen in an inkwell. His head and eyes follow the text as he writes with his right hand, while his left hand pulls the paper across the desk to ensure proper spacing. Charles is a mechanical nightmare. The form of the letters is controlled by a set of brass cams, and the message is defined by a replaceable metal disk, which takes six hours to "program" and requires painful tolerances. Changes in room temperature can cause him to make spelling mistakes.

The second clockwork android was "Henri, the Draughtsman", similar in appearance to his "brother" Charles, but whose talent is sketching. He can draw a picture of a boy with a butterfly; Louis XV of France; George III of England and his wife; and a little dog. When finished with a sketch, Henri bows his head and blows to clean the paper.

The third clockwork android was "Marianne, the Musician", the most realistic of the three. A bellows system gives her the subtle appearance of breathing, and she possesses small but convincing mannerisms that operate independently of her prime function: playing five songs on a small pipe organ. It is a real organ, her fingers actually play the keys, and she nods to acknowledge applause.

Jacquet-Droz's clockwork androids

Henri and Marianne are much less sophisticated than the fully reprogrammable Charles, and were constructed much more quickly. They were all still crowd-pleasers, which is apparently the reason they were built: they were exhibition pieces, designed to show off the skill of Jacquet-Droz and his two sons, and make a profit in themselves. It also seems that the Musician was a tribute to Jacquet-Droz's late wife Marianne, who had died after giving birth to a baby girl who herself lived only a few months.

The family took their creations on European tours, where they excited great interest, attention, and stories, one of which relates an embarrassment: while on a visit to the court of Louis XVI and Marie Antoinette, Henri the Draughtsman was to be programmed to draw Louis XV, but was set to draw the little dog by mistake. The court was not amused.

The European social upheaval that followed the French Revolution in 1789 greatly harmed the luxury trades, and Jacquet-Droz was forced to sell his mechanical family to a French firm in Madrid. The clockwork androids passed from buyer to buyer for over a century until, around 1900, they were located by the city of Neuchatel and bought at great expense, the funds being obtained by a public subscription. They were presented to the museum, and would later go on world tours. A modern generation, familiar with the kitschy and amusing "audio-animatronic" figures devised by Disney, might be forgiven if they were unimpressed by Jacquet-Droz's clockwork androids, but they still remain a marvel: clever and elegant in themselves, and, in the lifelike form of the pretty Marianne, inspiring an eerie sense of the passion of a master clockmaker for his lost young wife.

BACK_TO_TOP

[TUE 06 SEP 05] CRANBERRIES AS TECHNOLOGY

* CRANBERRIES AS TECHNOLOGY: As discussed in an article in THE ECONOMIST ("Red, Round, & Profitable", 18 December 2004), we tend to take the food on our table for granted, and it always comes as a bit of a surprise to find out just how much of a technology food is. One particularly interesting case in point is the cranberry, normally considered a holiday accessory but now a big business.

The cranberry is one of the three fruits, along with the blueberry and Concord grape, native to North America and not known to Europeans until they settled the New World. The Pilgrims found out about the bitter red berries from the local Indian tribes, who used them for food, as medicine, and for dyes; the newcomers called them "crane berries" because the plant's pointy blossoms could be seen, with a little imagination, as resembling the head of a crane.

Cranberries picked off the plant tend to be overly bitter, but the settlers later found that boiling the berries with maple sugar made them quite a treat. After the American Revolution, the US Navy bought cranberries by the barrel to ward off scurvy on long ocean voyages. They were expensive, however, since at that time nobody had figured out how to cultivate them. They only grew in small boggy potholes where the soil was very acidic and was a combination of peat and clay.

It wasn't until 1816 that a Cape Cod farmer named Henry Hall found out that the plants grew very well in soil covered by sand. This was the innovation that got cranberry production started, and it was followed by others. The second innovation was, according to legend, invented by John Webb, who introduced cranberry cultivation to New Jersey. Webb had a pegleg and found it difficult to carry his crop down into his cellar for grading and preservation; he simply set up a board ramp and poured them down. One of the useful features of the cranberry is that it has a rubbery, tough skin, and so good berries would bounce all the way down to the bottom with little harm, while ruptured berries would simply slide down a way and stop. Improved "bouncing" schemes are still used to sort cranberries.

The third innovation wasn't introduced until the 1960s. Up to that time, cranberries had to be picked by hand, which was a labor-intensive process, but cranberry farmers started laying out their plots so they could be flooded. Once the fruit was ripe, the water was let in, and a machine agitated the plants to free the berries, which would float to the surface of the water in their tidy rubbery skins, allowing them to be efficiently scooped up. "Wet" harvesting also protected the plants from snap frosts.

loading cranberries

* Even with such innovations, cranberries wouldn't have been any more than a marginal sort of foodstuff if three cranberry growers hadn't decided in 1930 to pool their resources in hopes of fending off the Depression. One of the three, John Makepeace, was already selling a brand of canned cranberry sauce with the appealing, if not exactly intuitive, tradename "Ocean Spray", and the group decided to adopt that as their name.

Ocean Spray became a success story. It is a cooperative, currently with about 800 members, instead of a traditional company, but it does big business, with about $1.2 billion USD in sales in 2003. How it got to be such a big business began with a disaster. Ocean Spray had been conducting a good if not spectacular business in cranberry sauce for decades up to 1959, when the operation was knocked flat by an environmental scare. A report revealed that cranberries had been found with high levels of pesticides, and though the affected cranberries hadn't been grown by any member of the Ocean Spray group, the end result was that nobody bought cranberry sauce for the holidays that year.

To fend off similar catastrophes in the future, Ocean Spray management decided that it would be wise to sell a product that customers would want year-round, not just during the holidays. The obvious product was juice. Producing juice is generally a tricky business, since juices tend to go brown and also produce an ugly sludge after a time. Cranberry juice tends to be unusually stable in this regard, but it still wasn't until 1963 that Ocean Spray was able to put their cranberry juice on the market. Getting people to buy a new product is tricky too, but along with a strong consumer advertising campaign, Ocean Spray went to bartenders, showing how cranberry juice could be used in tart mixed drinks, and to doctors, selling the juice on its healthy properties.

Ocean Spray retains leadership in the cranberry juice market, with high brand identification and customer loyalty, and the group does its best to stay there with product innovation, such as cran-apple and cran-grape juice. Mixed juices are not the only tactic. In the 1980s, Ocean Spray started selling dried cranberries to bakers and cereal manufacturers. Of course, drying out cranberries meant removing the juice, which was part of the manufacturing process for the juice line in the first place, but traditionally that had been done by crushing the berries. A new process was invented, in which distilled water was pushed into the berries, forcing the juice out and leaving behind a whole hull.

Some time later, it was realized that other fruit juices, from such items as blueberries or raspberries, could be injected into the hulls using much the same process. Such "flavored fruit pieces (FFPs)" bake well and preserve well, and they are also available year round at a relatively constant price; being halfway artificial, they don't have to be "in season" as much as other fruits. Ocean Spray researchers have come up with a gelatinizing process that allows FFPs to have a shelf life of two years, making them perfect for breakfast cereals. With such innovation, cranberries have come a long way from simply a holiday treat to complement a turkey dinner. The 2004 harvest was five times that of 1960, and Ocean Spray doesn't feel they've reached the limit on growth yet, either.

[ED: I confess that I am addicted to Ocean Spray cran-grape juice. I used to really like cran-strawberry, but apparently it didn't sell and I haven't been able to find it for years.]

BACK_TO_TOP

[MON 05 SEP 05] THE NORDEN & SPERRY BOMBSIGHTS (1)

* THE NORDEN & SPERRY BOMBSIGHTS (1): As discussed in an article from IEEE SPECTRUM from some years back ("The Bombsight War: Norden Vs. Sperry" by Lloyd Searle, September 1989), World War II brought the full introduction of the concept of strategic bombing, developed in the decades before the war. Fleets of bombers plastered factories and cities with carpets of bombs in savage air battles. While the British Royal Air Force (RAF) pursued area bombing at night, the US Army Air Forces (USAAF) pursued daylight precision bombing. One of the keys to precision bombing was the top-secret Norden bombsight, invented by Carl L. Norden, which was proclaimed as so accurate that it could put a bomb in a pickle barrel.

Norden bombsight

In fact, the Norden bombsight was based on relatively old technology, and the USAAF had access to a much more sophisticated bombsight, built by the Sperry Gyroscope Company. The Norden bombsight was an electromechanical system; the Sperry bombsight was one of the first such devices based entirely on electric servo systems. The Sperry bombsight was simpler, easier to use, and more effective than the Norden bombsight, and the autopilot associated with the Sperry bombsight was the basis for later autopilot technology. However, although the USAAF was interested in the Sperry bombsight and awarded Sperry large production contracts, the Norden Company proved to be better at sales, and the Sperry contracts were canceled. The Norden bombsight became the oversung hero of the precision bombing offensive -- which was never anywhere near as precise as its advocates claimed in the first place.

* A decade before the war, accurate high-altitude bombing was regarded as all but impossible. Simple bombsights could be used effectively at low altitude, but at high altitude, factoring in aircraft movements, altitude, wind direction, and the aerodynamics of specific types of bombs to get accurate results was a tricky problem.
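The core of the computation can be sketched with the simplest possible model -- a bomb dropped in a vacuum with no wind, which falls for a time set by altitude while carrying the aircraft's forward speed, leaving the sight to solve for the release angle. Real bombsights had to correct for drag, wind, and bomb ballistics on top of this, which is what made them hard to build; the numbers here are purely illustrative:

    import math

    # Vacuum, no-wind model of the high-altitude bombing problem.
    g = 9.81              # gravity, m/s^2
    h = 6100.0            # altitude, meters (about 20,000 feet)
    v = 90.0              # ground speed, m/s (about 200 MPH)

    t = math.sqrt(2 * h / g)                   # time of fall, ignoring drag
    fwd = v * t                                # forward travel of the bomb
    angle = math.degrees(math.atan(fwd / h))   # release ("dropping") angle

    print(f"fall {t:.1f} s, travel {fwd:.0f} m, release angle {angle:.1f} deg")
    # drag, wind, and bomb ballistics all perturb these figures,
    # which is why real bombsights were such complicated machines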

At that time, bombing was performed using a "pilot direction indicator (PDI)" scheme. The bombardier tried to sight in the target, and guided the pilot by pressing buttons that moved a needle indicator on the flight instrument panel. The pilot adjusted his course to keep the needle centered. The scheme was tricky and troublesome, particularly under combat conditions.

Carl L. Norden had worked for the Sperry Gyroscope Company on shipboard gyrostabilization systems beginning in 1911, and then became a consultant to the US Navy in 1915, working on various projects. In 1921, he began working on bombsights, and in 1923 set up his own business in partnership with another consultant for the Navy, a retired US Army colonel named Theodore H. Barth. In 1928, the US Navy awarded Norden and Barth a contract for 40 bombsights, and the two formally established their company as Carl L. Norden INC.

Norden delivered the first prototype for the "Mark XV" bombsight to the Navy in 1931. The Mark XV provided a telescopic sight that was mounted in a gimbaled frame to keep it level and steady while the aircraft flew. It was stabilized by a pair of DC-powered gyroscopes spinning at 7,800 RPM. One gyroscope compensated for vertical (range) motions, the other for horizontal (azimuth) motions. The Norden bombsight still suffered from a number of limitations: its gyroscopes had to be leveled by hand using bubble levels; their DC motors used carbon brushes that demanded maintenance; and the sight was designed around the modest speeds and altitudes of Navy aircraft.

The Norden bombsight was still much better than anything else the Navy had available, and so the Navy decided to standardize on it. The Navy also established the Norden Company as a "dedicated" source, meaning that the Navy would only buy bombsights from Norden, and Norden would only sell them to the Navy. The US Army Air Corps (USAAC, the predecessor to the USAAF) was also impressed by the Norden bombsight, and adopted it in 1934. [TO BE CONTINUED]

NEXT
BACK_TO_TOP

[FRI 02 SEP 05] REAGAN'S WAR ON TERROR (5)

* REAGAN'S WAR ON TERROR (5): One of the consequences of the Iran-Contra scandal was a shakeup in the government, with Howard Baker becoming Reagan's new chief of staff. There was a general house-cleaning. From the point of view of the war on terror, the most important effect was to effectively put Secretary of State George Shultz at the head of anti-terrorist policy.

Shultz was a hardliner who favored military action, but in fact there were no further major terrorist attacks against US targets until the very end of the Reagan Administration. On 21 December 1988, Pan Am Flight 103, a Boeing 747 jumbo jet en route from London to New York, blew up in flight over the small town of Lockerbie, Scotland. All 259 people on board were killed, as well as 11 citizens of Lockerbie. The who, what, why, and how of the attack were very mysterious. No group claimed responsibility for the attack. Although Colonel Qaddafi had been keeping a low profile since EL DORADO CANYON, US intelligence eventually concluded that Libyan agents, possibly rogues, had conducted or at least been involved in the bombing.

A long, drawn-out legal controversy followed, with sanctions pressed against Libya until Colonel Qaddafi handed over two Libyan intelligence officers accused of plotting the bombing of Pan Am 103. The trial began in May 2000, and in January 2001 it ended with the conviction of Abdelbaset Ali Mohmed al-Megrahi, who received a sentence of life in prison. The other defendant, Al Amin Khalifa Fhimah, was acquitted.

* The action against Libya was characteristic of the approach used by the first Bush Administration and the Clinton Administration against terrorism: use law enforcement to track down the perpetrators and bring them to trial. The approach was based on respect for the rule of law, but it was ineffectual, as was proved when al-Qaeda terrorists struck the US on 11 September 2001. Ironically, the 11 September attacks had some roots in actions taken by the Reagan Administration. The holy war of the Afghan Mujahadin, who had been heavily supported by the US, had led to the emergence of a number of radical Islamic factions who despised Western culture and Western interference in the Middle East. The most powerful and dangerous of them was al-Qaeda.

Nobody could have predicted such a result back in the 1980s, but the general pattern of American actions against terrorism up to 11 September shows that it was usually a back-burner issue, dealt with in an improvised fashion. After that day, few had any doubt that the US had a real war on its hands, and would have to organize itself to fight it accordingly. It is proving to be a very difficult task, but the commitment is clear: nobody is going to forget 11 September. Reporter Bob Woodward put it clearly:

BEGIN QUOTE:

These terrorist incidents ... used the tools that were available, but it was never in a coherent way. I know from talking to those people at the time, it was always: "Oh, we've got this crisis. We're dealing with the ACHILLE LAURO now," -- or "We're dealing with Qaddafi." -- or: "We're dealing with Libyan hit squads." -- or: "We're dealing with Beirut." And ... they never got in a position where they said: "You know, this is a real serious threat." -- not just episodically, but it's going to be a threat to this country throughout the administration, future administrations.

We need to organize to fight it. It can't be a back-bench operation for the FBI and the CIA. It's got to be somebody's issue, so it's on their desk every day. What do we know? What's being planned? What are the threats out there?

END QUOTE

Although the Reagan Administration's actions against terrorism appear muddled in hindsight, it is hard to judge them harshly, since the two following presidential administrations did little better. The Reagan Administration's major defense concern was the Soviet Union, and arguably their actions helped end the Soviet Union and the Cold War. Terrorism is simply difficult to deal with.

The Reagan Administration had its successes and failures, and if in hindsight the failures might seem to outweigh the successes, there was a time when Ronald Reagan was widely admired. Even many of those who distrusted his hard conservatism liked him personally, regarding him as something like a kindly uncle with extreme political views. Few could take pleasure in the fact that the forgetfulness that seemed like a convenient excuse during the Iran-Contra hearings proved to be a symptom of oncoming senility that overwhelmed him within a few years, turning him into a pathetic invalid. He deserved better. [END OF SERIES]

START | PREV
BACK_TO_TOP

[THU 01 SEP 05] TIME SPAMMER

* TIME SPAMMER: One of the things about spam that makes it annoying, along with its quantity and intrusiveness, is its subject matter. Spam rarely offers anything anyone would really want to buy, and many of the things it does try to sell aren't mentionable in proper company, as well as being obvious scams.

According to WIRED.com, certain spam has broken the mold and then some. In the summer of 2003, a programmer from Iowa named Dave Hill got a message from the address "Robby0809@aol.com" titled "Time Travelers PLEASE HELP". The message appealed to anyone who was a "time traveler or alien disguised as human", offering $5,000 USD to anyone who could provide such items as an "Acme 5X24 series time transducing capacitor with built-in temporal displacement" and an "AMD Dimensional Warp Generator module containing the GRC79 induction motor". It went on to say that the sender's life had been "severely tampered with" and he needed "temporal reversion" to go back in time and make things right.

Hill assumed this was an off-the-wall joke. He decided to play along and answered, saying he could provide such items. He managed to get in touch with a fellow who called himself "Bob White". Other netizens were also fascinated by what they thought was a gag and got in touch with Bob White, or "Tim Jones" as he referred to himself sometimes. Nobody was sure just what was going on. Many, like Hill, thought it was a gag. Some thought the mysterious "time spammer", as they called him, was trying to get materials for a science fiction novel, while others thought it was some kind of oblique scam, possibly to collect validated email addresses for spam lists.

Hill got so much into the joke that he created a fake online store to sell items out of science-fiction stories, and shipped the mysterious Bob White a "warp generator", which was actually an old hard disk drive. Then things got more bizarre. White responded, thanking Hill and asking for more gear. Hill thought the joke was being taken a little too far and began to wonder if White was actually "a person challenged by reality and as such deserves our sympathy and support."

Hill's suspicion was correct. Bob White was tracked down and turned out to be 22-year-old Robert "Robby" Todino of Woburn, Massachusetts, who on being queried about the matter admitted that he had sent out 100 million inquiries about time-travel technology since November 2001. Todino understands that the messages he sends out aren't always taken seriously, but he insists that he is of perfectly clear mind and adds: "A lot of people will say the stuff I talk about is crazy and out of this world. But I know for a fact that it is true and does exist. Untrained minds may disagree with me, but they don't have access to the sources that I do."

He does feel frustrated with the progress of his campaign, however: "It almost feels worthless now because the people who are monitoring my every move always seem to win. But it's the only form of communication I have right now." He believes that there is a conspiracy to block his efforts, for example interfering with attempts by helpful netizens to teleport a time machine to Woburn. His father, Robert Todino SR, has some concerns over his son: "What bothers me is that some people are trying to sell him equipment and take advantage of him. He's invested a lot of money into it and has been hurt by it."

The state of Massachusetts also has some concerns with Robby Todino, since his time-travel-tech spam is just a sideline. Todino is a full-time spammer, and the authorities have not been happy with his mass mailings of fraudulent ads for "free government grants" and "detective software". In 2001, Todino was hit with a $5,000 USD fine and agreed to cease and desist in sending out bogus email ads. Todino started churning out the time-travel spams shortly after that. The state of Massachusetts has been monitoring his activities, but has no comment on the time-travel spams.

A jazz-pop trio from NYC named "GrooveLily" wrote a song dedicated to "robby0809, wherever and whenever you are". It was titled "Rewind":

   Calling all aliens
   And time-traveling superfriends
   Secretive scientists with secretive client lists
   Calling all aliens

   I know you've got this practical invention
   To move a human through the fourth dimension
   I need to get my hands on the remote control of my life
   And press rewind
                 rewind
                   rewind
                     rewind

   I'm feeling paranoid
   Hanging like Harold Lloyd
   The clock face is dripping and my grip is slipping
   I'm feeling paranoid

   Ground control is tampering with my flight plan
   Shortening and hampering my life span
   I need to get my hands on the remote control of my life
   And press rewind
                 rewind
                   rewind
                     rewind

   If I could rewind my life
     If I could revise my ways
       If I could rewrite my lines
         If I could rerun some days
   I'd take back all the ugly things I said
   And all the people I misled
   Would see me looking shiny new and clean

   So
   I'll give you lots of cash
   If you take out my karmic trash
   I will need proof
     but I'll pay through the roof
   I'll give you lots of cash

   H.G. Wells me back to where I started
   Please don't let me die here brokenhearted
   I need to get my hands on the remote control of my life
   And press rewind
                 rewind
                   rewind
                     rewind

   I am calling all aliens
     I'm calling 
       I'm calling 
         I'm calling
           calling all aliens

Those not familiar with the silent film comedian Harold Lloyd will still probably recognize the great scene of Lloyd hanging for dear life off the hands of a huge clock far above a busy street. The bit about the "dripping clock" sounds like it's throwing in a little Salvador Dali as well.

BACK_TO_TOP
< PREV | NEXT > | INDEX | GOOGLE | UPDATES | EMAIL | $Donate? | HOME