may 2007 / last mod apr 2015 / greg goebel

* 23 entries including: communications infrastructure, THE MAKING OF THE FITTEST, XO security systems, domain-tasting scams, local money in Germany, biofuel versus food, mediation services, cellulosic ethanol considered, neglected tropical diseases, surrogate reproduction, recycling paper, Marine space assault vehicles, medical ID theft, nonobviousness & patents, ultracheap cars, and tort reform in the US states.

banner of the month



* GIMMICKS & GADGETS: People keep thinking up new and interesting things to do with cellphones, with an article from MIT TECHNOLOGY REVIEW Online ("Jott: Calling It In" by Wade Roush) describing the new service being offered by a Seattle startup named "Jott", founded by two ex-Microsoft employees in December 2006.

A Jott user can call a toll-free number on a cellphone, dictate a message up to 30 seconds long, and then specify recipients from a predefined list of addresses, labeled as "myself" or "family" or "friends" or whatever. The message is transcribed and then sent via email or text messaging. Suppose a businessman is at an airport and is going to miss a flight; he can then use Jott to let all the folks he's meeting know he won't arrive on schedule, as long as he's set up the address list ahead of time. Jott is initially providing the service for free as a promotional measure, but will eventually offer a "premium service" on a subscription basis.

Jott is not an entirely new idea: doctors and other professionals have had access to call-in transcription services for some time and it's a big business, and there's nothing unusual about cellphones that can store voice memos. Other services are arising to convert and distribute voice mail messages over email in the UK and India. Eventually it may be something a cellphone user simply expects by default from a service provider.

* Americans don't tend to think of Mexico as a particularly high-tech country, but technical progress has a tendency to steamroller stereotypes, and a BBC WORLD TV report demonstrated a Mexican initiative that the gringos to the North could well envy: a "digital classroom" system that is now being used to teach five million ten- and eleven-year-old schoolkids.

The centerpiece of the "EncycloMedia" system is an oversized digital display at the head of the class. It's touch-sensitive, so the teacher can interactively control the lesson and kids can get up to try to solve problems. The displays do not simply act as electronic slide presentations, either, with the lessons providing still and video imagery, including digital models and animations, as well as interactive lessons. Maps can be displayed and overlaid with geographical or sociological data, showing, say, climate change, population density, and so on. The teacher can press an icon on the display and a "roulette wheel" will come up to select one of the kids in the class at random to answer a question.

The varieties of material available for the EncycloMedia system illustrate that it is in fact a system: by itself, the digital display would be just a gimmick; it's the multimedia lessons it provides that make it work. Right now, the EncycloMedia knowledge base has about 20,000 different items. A team of 400 personnel is working on developing the knowledge base, which of course will grow by multiples as other class levels are added. Even in its current state, no other nation has anything close to it, and the US, India, and China have expressed interest.

* According to BBC WORLD Online, police in crime-ridden Caracas, Venezuela, are now flying three 15 meter (49 foot) long robot blimps over the city to keep an eye on the streets with video cameras. The blimps relay their imagery to a control center in the middle of the city, where dispatchers can alert police to a crime in progress.

* According to MIT TECHNOLOGY REVIEW Online, researchers at Virginia Tech in Blacksburg, Virginia, have built a robot with a "toroidal" propulsion scheme. It looks like a tubular frame with tracks running around all the sides. The tracks are actuated by rings inside the toroid, with a ring contracting up front and expanding in the back to move the tracks. The scheme is well suited to probing through collapsed buildings and the like, since it can get traction on uneven ground and squeeze through small openings. The current system is just a demonstrator -- a practical machine would have to have onboard power, controller systems, sensors, and manipulators.

diatom-based sensor

* A short comment in BUSINESS WEEK discussed how Kenneth H. Sandhage, a professor of materials science, has used the skeleton of a "diatom" as the basis for a microtech sensor. Diatoms are single-celled oceanic plankton that create a skeleton of silicon dioxide; there may be about 100,000 species, with a wide range of skeletal configurations. Sandhage has suggested that diatoms could even be genetically engineered to produce skeletons as required for a particular application.



* OLPC SECURITY: Much has been said of the "One Laptop Per Child (OLPC)" or "$100 computer" here in the past. The OLPC -- now somewhat thankfully designated the "XO" -- features a swiveling 19 centimeter (7.5 inch) LCD screen that can switch between low-resolution color and higher-resolution black-and-white modes; a camera and microphone for video calls; three USB ports; 128 MB of RAM; 512 MB of flash storage; built-in mesh-networking wi-fi; a long-endurance battery rechargeable by a power cord or car battery; and a custom, Linux-based operating system. The XO is expected to go into production late in 2007, with Thailand, Brazil, Uruguay and Rwanda, among others, signed up for the launch.

According to an article in WIRED Online ("High Security For $100 Laptop" by Ryan Singel), considerable thought has been put into security for the XO, all the more so because kids are expected to be heavy users of the machine. The XO won't feature firewalls and antivirus software, however. Says Ivan Krstic, a computer security guru from Harvard University who's directing the XO security effort: "How can you expect a 6-year-old to make a sensible decision when 40-year-olds can't?" He argues that most existing security systems simply train users to click YES.

Krstic's approach is named "BitFrost" and involves only one prompt -- turning on the camera -- obtaining security by ensuring that application programs have limited powers, running in a "virtual machine" with a limited set of permissions. A picture browser can't access the web, so even if a hacker were able to compromise the browser, there would be no way to obtain the images over the internet. Says Krstic: "Applications can no longer run rampant. Spyware becomes very, very hard. It can't spy on the keyboard. You can only spy on how a user uses their program."

Some programs will have extraordinary access rights, but only if they have been certified by a "trusted authority" -- in practice either the OLPC group or the national authority backing a local OLPC program. Users will also be able to assign special permissions to some programs, but only manually, through a control panel. Krstic feels that BitFrost is much more secure than the security systems featured on Microsoft Vista or the latest Mac OS releases, but admits that it limits interactions between applications: "This kind of model makes it more difficult for glue between applications to be built, but 99 percent don't need glue."
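The deny-by-default permission model described above can be sketched as a toy example -- hypothetical application names and permission labels, not actual BitFrost code:

```python
# Toy illustration of a BitFrost-style permission model: each application
# runs with an explicit, limited permission set, and anything not granted
# by default (or manually by the user) is denied.

DEFAULT_PERMISSIONS = {
    "picture_browser": {"read_images"},            # no network access at all
    "web_browser":     {"network", "read_docs"},
    "text_editor":     {"read_docs", "write_docs"},
}

def is_allowed(app, action, user_grants=()):
    """An action succeeds only if granted by default or via the control panel."""
    granted = DEFAULT_PERMISSIONS.get(app, set()) | set(user_grants)
    return action in granted

# Even a compromised picture browser cannot reach the internet:
assert not is_allowed("picture_browser", "network")
# ...unless the user explicitly grants that permission by hand:
assert is_allowed("picture_browser", "network", user_grants=["network"])
```

The point of the model is that the burden of judgment moves from the user clicking YES to the static permission table.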

The XO will also feature an antitheft system. Each XO is assigned a cryptographically-secured "lease" that will allow it to operate for a given length of time. Every day, the laptop will check into a country-specific server to see if it's been reported stolen. If so, it's turned off; if not, the lease is extended for a few weeks. For areas without internet connectivity, the lease is handled through a local server, say one residing in the school where XOs are being used. Krstic is optimistic about BitFrost, but is too professional to be complacent: "I fear there is something I missed."
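The lease mechanism can be sketched along these lines -- a toy model with an assumed shared-secret signature, not the XO's actual cryptographic formats:

```python
# Sketch of a signed, expiring "lease": the server signs a (laptop, expiry)
# pair, and the laptop refuses to run on a bad signature or a stale lease.
import hashlib
import hmac
import time

SERVER_KEY = b"country-server-secret"   # assumed shared secret (toy model)

def issue_lease(laptop_id, days=21, now=None):
    """Server signs a (laptop, expiry) pair; renewed at each check-in."""
    start = int(now if now is not None else time.time())
    expiry = start + days * 86400
    msg = f"{laptop_id}:{expiry}".encode()
    sig = hmac.new(SERVER_KEY, msg, hashlib.sha256).hexdigest()
    return laptop_id, expiry, sig

def lease_valid(laptop_id, expiry, sig, now=None):
    """The laptop checks the signature and the expiry date before running."""
    msg = f"{laptop_id}:{expiry}".encode()
    expected = hmac.new(SERVER_KEY, msg, hashlib.sha256).hexdigest()
    current = now if now is not None else time.time()
    return hmac.compare_digest(sig, expected) and current < expiry
```

A stolen laptop simply stops receiving renewals from the country server, and shuts itself off once the current lease runs out.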

ED: One of the interesting possibilities of the "lease" system is its use for "pay as you go" purchase schemes. Another interesting question not addressed here is how system upgrades or application software distribution will be handled.



* DOMAIN-TASTING FEUD: One of the interesting things about the introduction of the internet has been the way the system has led to such a wide range of dodges, tricks, and scams -- not all of which are even against the law since nobody ever imagined them before, much less passed laws to prevent them.

One case in point, as discussed in an article in BUSINESS WEEK ("See Anything Odd About 'Vorizon'?" by Moira Herbst, 8 January 2007), is the phenomenon known as "domain tasting". In 2000, the Internet Corporation for Assigned Names & Numbers (ICANN), which as its name implies controls assignment of website domain names, established a "create grace period": anybody who claimed a domain name could cancel in five days and get the registration fee back. The idea seemed innocent enough, the intent being to reduce snarls when people misspelled their desired domain names and had to cancel.

The end result was not one that could have been easily foreseen. The legal department of Verizon Communications now spends a good amount of time checking the internet for domain names that are near-clones of "verizon.com" -- for example, "verizonpicture.com", "vorizonringtone.com", and "vorizoncellularphone.com". These names never have any real connection to Verizon and are registered to front companies from all over the world. It's easy for users interested in Verizon to end up on these sites, which are full of ads, with the site owners obtaining a bit of profit every time one of the users clicks on an ad. The owners simply cancel the site before the five-day grace period is up, and they are then refunded their $6 USD registration fee. However, then they often just go back and register the same names again.

Such "domain-tasting" is not a marginal phenomenon either. In 2004, on each day about 100,000 domain names would be operating on a trial basis. Now it's about four million a day, with only about 2% actually ending up being used. Companies that handle domain-name registrations are perfectly happy with this, since even though they have to pay back the money, that 4 million means $24 million USD in the bank that's earning interest for a few days, and that slush fund is being continuously replenished.

The whole scheme is perfectly legal. Big players who are getting stung by the diversion of customers to the copycat websites -- including Verizon, Time Warner, Dell, and Neiman Marcus -- are annoyed, with Neiman Marcus pressing a lawsuit against Dotster, a domain-name registration service, accusing Dotster of fraud and collusion in a matrix of domain-tasting operations. The players have been pressing ICANN to change the policy. ICANN officials reply that the organization is set up to implement policy for the community but doesn't decide itself what the policies should be. A spokesman says: "We're definitely taking action to inform the community about the issues involved, and it's up to them to decide what to do."



* THE MAKING OF THE FITTEST (8): One of the more intriguing revelations of the genetic analysis of organisms is that a pool of functionally identical genes is shared among them. A gene is a sequence of the four DNA "bases" -- A, G, T, and C -- that codes for a single protein, with "triplets" of bases defining each "amino acid" building block of the protein. There are typically about 400 amino acids in a protein.

The average rate of mutation of a genome is about one base pair per 100 million base pairs per generation. Since most organisms have generation times of a year or less, sometimes much less, it would seem that in 100 million years or so random mutations would have ensured that there would be no genetic similarity between distantly related species of organisms. However, in reality there are many similarities in the genes of distantly related species. In fact, there are about 500 genes that are universal to all forms of life: since the likelihood of two different organisms deriving genes with highly similar codes from random mutations is vanishingly small, by implication these genes have been around since nearly the beginning.
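A back-of-envelope sketch, using the round numbers given above plus an assumed one generation per year, shows why random mutation should have scrambled any gene long ago:

```python
# At ~1 mutation per 10^8 base pairs per generation, over ~10^8 generations,
# how much of a gene would survive untouched purely by chance?
import math

rate = 1e-8          # mutations per base pair per generation
generations = 1e8    # ~100 million years at one generation per year

# Poisson approximation: probability that a given base is never hit.
p_site_unmutated = math.exp(-rate * generations)
print(f"chance a given base escapes mutation: {p_site_unmutated:.2f}")   # ~0.37

# For a gene of ~1,200 bases (400 amino acids x 3 bases each), the chance
# the whole sequence comes through unchanged is vanishingly small:
log10_p_gene = 1200 * math.log10(p_site_unmutated)
print(f"chance a 1,200-base gene is untouched: about 10^{log10_p_gene:.0f}")
```

The near-perfect preservation of the 500 universal genes therefore cannot be an accident of slow mutation; something has to be weeding out the changes.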

However, since genes are always mutating, how could these "immortal genes" have been preserved? The answer's simple: the genes are so important that any major change in them is at the very least disadvantageous, and at worst fatal. In evolutionary terms, these genes correspond to very steep "fitness functions": organisms pushed away from the optimum can't survive over the long run, if they can live at all. Some sources say that immortal genes are "maintained", but this is misleading because it implies a "maintainer" is keeping them in shape. No, the only thing that maintains them is the fact that changes in them don't work out for the better.

It might be argued that the reason the immortal genes have stuck around is because some entity really is maintaining the immortal genes. Along with this notion being excess baggage, there being no need to assume it and no evidence to support it, there is also the difficulty that immortal genes are only functionally identical: they produce proteins that work the same way. These proteins may have slightly different amino acids in some places, though they will share a common subset. That corresponds to variations in the gene's coding.

Furthermore, DNA's coding scheme is redundant. There are 64 different combinations of three of the four bases, and only 20 amino acids. That means that many amino acids are coded for by more than one triplet. Even two immortal genes that create entirely identical proteins could have noticeably different (but still equivalent) base sequences. This variation among immortal genes is compatible with natural selection; it would be whimsical and puzzling as the act of a consciousness.
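The redundancy arithmetic can be checked in a few lines:

```python
# Three codon positions with four possible bases each yields 64 codons for
# only 20 amino acids (plus stop signals) -- so on average roughly three
# different codons encode each amino acid.
from itertools import product

bases = "ACGT"
codons = ["".join(triplet) for triplet in product(bases, repeat=3)]
assert len(codons) == 4 ** 3 == 64

amino_acids = 20
print(f"average codons per amino acid: {64 / amino_acids:.1f}")   # 3.2
```

That slack in the code is what allows two genes to produce identical proteins from noticeably different base sequences.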

Even more significantly, there is a branching tree of variations among the immortal genes of different species that generally tracks their taxonomic relationships: for example, the immortal genes of a plant are more similar to those of another plant than they are to those of a human being. Furthermore, this is true across the board: all the immortal genes of a plant are more like those of another plant than they are like those of a human. One could take any one of the immortal genes from a set of organisms and come up with, allowing for some noise, a more or less identical tree of relationships. [TO BE CONTINUED]



* INFRASTRUCTURE -- COMMUNICATIONS (4): The classic land-line telephone system is of course only about half of the modern telephone story, the other half belonging to cellphones. The first "mobile phone" systems were introduced in the late 1940s. They were based on "frequency modulation (FM)" radio technology that had come to maturity during World War II. Attempts to use older "amplitude modulation (AM)" technology for mobile phones had not worked out very well -- AM is more prone to noise, and in particular to interference. If an AM receiver picks up two transmissions on the same band, the user will hear both of them -- which is actually useful in some circumstances, for example in police radios where it's good to hear what's going on with everybody, but which is troublesome for holding one-on-one conversations. In an FM receiver, the stronger signal blocks out the weaker one.

The early mobile phone systems were linked into the normal phone network through a central base station, usually using an antenna on top of a tall building, with the antenna giving communications over a radius of about 100 kilometers (60 miles). There were all of 11 channels in the band assigned to the mobile phone service, meaning no more than 11 conversations could take place at once. At first, a user had to hunt around through the channels to find one that was open, but later the open channels carried a tone to allow a mobile unit to find an open channel on its own.

The system was basically a rich man's toy, and by the 1960s its limitations were obvious to all. The US Federal Communications Commission (FCC) didn't regard mobile phones as very important and didn't want to allocate more radio spectrum bandwidth to them, but complaints kept coming in, and in 1968 the FCC transferred 14 unused UHF TV channels to mobile telephony. At the time, there were only about 70,000 mobile phone users in the USA.

* The cellular phone concept was proposed by AT&T Bell Labs in 1971, in the form of the "Advanced Mobile Phone System (AMPS)". Instead of linking mobile phones through a single powerful radio station, a network of smaller, less expensive, low-power FM radio stations was distributed over the service area, forming localized communications "cells". Each cell station had a "trunking controller" that used dedicated channels to control the allocation of available channels to subscribers. All the cell stations in an area were linked through a central switching center.

The cellphone scheme allowed the service area to be incrementally expanded just by adding stations, and it permitted the reuse of the same channels in other cells without, in principle, concern for interference. Initially, an AMPS system provided 333 channels -- later expanded to 416 channels -- including control channels, though care had to be taken to ensure that the same channel wasn't used in two immediately neighboring cells since that might lead to interference.
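The channel-reuse constraint can be illustrated with a toy cell layout -- a hypothetical adjacency map, not a real AMPS deployment:

```python
# Frequency reuse: the same channel group may be assigned anywhere in the
# network EXCEPT to two immediately neighboring cells, where it could
# cause interference.

# Adjacency of a small, made-up cluster of four cells:
neighbors = {
    "A": {"B", "C"},
    "B": {"A", "C", "D"},
    "C": {"A", "B", "D"},
    "D": {"B", "C"},
}

def reuse_plan_ok(assignment):
    """True if no two adjacent cells share the same channel group."""
    return all(assignment[cell] != assignment[other]
               for cell, adj in neighbors.items() for other in adj)

# Non-adjacent cells A and D can safely reuse channel group 1:
assert reuse_plan_ok({"A": 1, "B": 2, "C": 3, "D": 1})
# Adjacent cells A and B must not share a group:
assert not reuse_plan_ok({"A": 1, "B": 1, "C": 2, "D": 3})
```

This is the same constraint-satisfaction problem as map coloring, which is why a modest pool of channel groups can cover an arbitrarily large service area.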

AMPS handset

If the number of subscribers in a particular cell began to overflow the cell's capacity, the cell could be split into smaller cells by installing a set of less powerful transmitters. AMPS was an analogue scheme, applying traditional FM technology to the cell concept. AT&T proposed the idea in the early 1970s, but due to various complications, such as squabbles with the FCC over bandwidth allocation, AMPS wasn't introduced commercially until 1983. It was workable enough to inspire similar, though not compatible, analogue cellphone schemes elsewhere.

By 1992 there were 5 million AMPS handsets in operation in the US and 3 million analogue handsets in operation in Europe. However, this was still regarded as only scratching the surface of the market. Costs were relatively high and something better was needed. [TO BE CONTINUED]



* FUNNY MONEY: Much fuss was made over the introduction of euro banknotes in 2002. The standardized euro was, in principle, to do away with the confusion of local currencies used by different European countries, making it easier to conduct business across borders. No more having to worry about exchanging d-marks for francs on a trip from Germany to France.

However, as reported by BBC WORLD Online ("Germans Take Pride In Local Money" by Tristana Moore), local currencies have been creeping back into circulation in Germany. In the city of Magdeburg, for example, a local office or "central bank" issues the "Urstromtaler" at a 1:1 exchange rate to the euro, with 200 businesses accepting it as payment and the notes supported by an online banking system. The notes are dated and "time out" after a certain interval, becoming worthless. The scheme was created by a local lawyer named Frank Jansky.

Exactly what the attraction might be of a currency that's only recognized locally and is only good for a limited time is a bit hard for outsiders to understand. Part of the explanation is that Magdeburg is in the East and hasn't really shaken off its Communist-era dilapidation. Unemployment there is running at 20% and many of the young people have gone off to greener pastures elsewhere. The Urstromtaler amounts to a cooperative arrangement to encourage commerce among those who make use of the currency. Says a local enthusiast, Joerge Dahlke: "Everyone who uses the regional currency develops a social network. People get to know each other."

It does sound like a gimmick, but Dahlke adds: "We are disillusioned with the euro, as it doesn't bring many benefits to the local community. But at the same time, we don't want to get rid of the euro completely. Our regional currency runs in parallel to the euro. Of course, we still need the euro for big purchases."

Magdeburg isn't an isolated case. According to Professor Gerhard Roesl, who wrote a report on the phenomenon for the Bundesbank, the German central bank, there are at least 16 regional currencies in Germany. Comments Professor Roesl: "The regional currencies are not really a threat to the Bundesbank, although technically they are illegal and could pose a problem. The Bundesbank tolerates the local currencies, which are regarded as a kind of 'social money'."

Frank Jansky and other backers of local currency schemes are lobbying for clarifications in the laws. Says Jansky: "The Bundesbank is keeping an eye on what we are doing. Regional currencies are still in a legal gray area. But there are other comparable financial schemes, like 'miles and more', which also pose a challenge to the status quo. We are supporting our regional economy and culture, which will benefit future generations."



* BIOFUEL VERSUS FOOD: The runaway enthusiasm for alternative fuels in this modern day and age has its skeptics, particularly with regards to the fad for corn-based ethanol in the USA. Some of the skeptics have gone so far as to suggest that production of corn-based ethanol actually consumes more energy than it delivers; that's generally regarded as a fringe opinion, but even its biggest advocates admit that the ratio between input and output isn't all that impressive.

Another aspect, as discussed in an article in BUSINESS WEEK ("Food Versus Fuel" by John Carey and Adrienne Carter, 5 February 2007) is the pressure placed by biofuel production on food supplies. In 2006, 112 US-based ethanol plants pumped out a billion gallons of ethanol, using about a fifth of America's corn crop. If all the ethanol plants being planned go into operation on schedule, by 2008 ethanol will gobble up half of the corn crop. The price of corn-based animal feed has doubled, and the price of meats is certain to go up accordingly.

corn ethanol plant

While corn ethanol may be the most dubious of the biofuels, it is only different in degree, not in kind, from its brethren. Rapeseed production is skyrocketing in Germany as demand for biodiesel ramps up, incidentally pushing up costs of protein meal and cooking oil, also made from rapeseed. In countries such as Indonesia and Malaysia, forests are being burned off to make way for palm plantations, with the palm oil to be turned into diesel. Some of the critics see the push towards biofuels as not merely an unsustainable mirage, but as a source of conflict between the haves -- with their thirst for fuel to run their automobiles -- and the have-nots -- who just want to eat.

The skeptics, however, are a minority. Advocates point out that food prices in the US are at a historic low, and a modest rise is not going to be very painful. In addition, biofuels mean a better livelihood for farmers, translating to reductions in government subsidies funded by the taxpayer: US farm subsidies ran to $8.8 billion USD in 2006, but they're expected to drop to $2.1 billion in 2007. Rural America, which has been struggling, is now recovering, improving the economic health of the nation as a whole. Some economists believe that biofuels will be a similar economic boon to developing-world countries. Agriculture is also highly competitive, with farmers able to ramp up production with surprising speed, keeping a lid on costs for the consumers. Some farmers even worry that the race to increase production will lead to a glut and force prices down.

There is still talk of biofuels reducing energy dependence on OPEC, but that's a fantasy for the moment, particularly for corn ethanol, which will never be able to replace gasoline even in principle. Everybody believes the long-term future lies with cellulosic ethanol, particularly derived from prairie grasses -- which are not only an efficient feedstock in principle, but can be grown on lands not useful for food crops. A Stanford biologist estimates that only 3.5% of the world's land could provide all the needed fuel -- compared to the 13% now used for agriculture. The big catch is that nobody's figured out how to convert the "efficient" prairie grasses into biofuels in a cost-effective fashion just yet. The biofuels game remains a big gamble: the future has both its threats and promises.

* According to a later Reuters article cited on the CNET ONLINE site, in 2007 US farmers are planting the biggest corn crop, in terms of acreage, in 63 years. The sheer bulk of corn production will help keep prices down, so the article claimed, but it's still selling at record highs.

BBC WORLD had another report along such lines, saying that the world's cropland devoted to biofuels has increased by a factor of 25 in the last seven years and is now equivalent to the entire area of the UK; it continues to grow rapidly. BBC WORLD Online also had an interesting short article that described how locals on the island of Bougainville, part of Papua New Guinea, are making biodiesel from coconut oil, in essence running their cars and trucks on coconuts. Somehow the idea seems both completely sensible and slightly humorous at the same time.



* TALK THINGS OUT: According to an article in THE ECONOMIST ("Knocking Heads Together", 3 February 2007), the British Bank of Credit & Commerce International pressed a lawsuit against the Bank of England -- to suffer through 13 years of litigation at legal costs of 100 million pounds, or almost $200 million USD. The presiding judge called it all a "farce". The sensible think twice or three times before taking an adversary to court, since a civil case can become an expensive rathole down which endless time and funds can be poured -- a process only sure to enrich the lawyers, who have a vested interest in complicating matters and stretching the case out.

There is an easier way: the two banks could have gone to the London-based "Center for Effective Dispute Resolution (CEDR)", which handles 3,000 or so commercial disputes a year. About 75% of the disputes are settled in one or two days, with about 15% settled in a few weeks. There is in fact now a push towards out-of-court "alternative dispute resolution (ADR)" all over the world. CEDR, a non-profit organization, has alliances with similar organizations in the Netherlands, France, Italy, and the US. CEDR also provides consultation and training to aspiring mediators in Saudi Arabia, Russia, Finland, China, Thailand, Japan, Bangladesh, Cameroon, Uganda, Nigeria, Bosnia, and Slovenia.

There are two approaches to ADR: arbitration and mediation. Arbitration uses a mock court approach to resolving disputes, with lawyers on both sides making their cases to a neutral decision-maker. Arbitration is seen as preferable to a court case, but in many cases it ends up being just as drawn-out and expensive. In contrast, mediation simply lets the two sides talk things out, with a moderator setting some rules, keeping the discussion within bounds, and providing helpful suggestions and advice. It is a simple, direct approach, and the parties do not have to worry about corrupt judges: the parties are in control of the proceedings, and no solution is imposed on them; they come up with the solution themselves.

America's main mediation service, JAMS, was set up almost three decades ago and now has more than 200 full-time mediation "neutrals" on its books. The neutrals are usually former judges, attorneys, or law professors; they currently handle about 10,000 cases a year, and the caseload is growing. In 2003 a legal scholar wrote of "the vanishing trial": in 1962, 11% of civil cases went to court; the rate is now under 2%. The number of federal tort cases that went to trial dropped 80% from 1985 to 2003.

Several US states -- including Oregon, Texas, California, and Florida -- have made mediation mandatory for certain classes of claims, allowing the parties to go to trial only if mediation fails. Some 1,500 US law firms and 800 US companies -- including Time Warner, UPS, General Electric, and Coca-Cola -- have established an informal alliance in which each promises to pursue ADR before taking another alliance member to court. In Europe, Finland and Denmark are thinking about rules to help promote mediation, and the European Commission is working on a directive to harmonize mediation. A UK-China business mediation center was recently set up, with offices in both London and Beijing. English courts are increasingly pushing parties towards ADR and threatening them with litigation costs if they refuse, though the authorities are not keen about making mediation mandatory. Many other countries -- including Argentina, Canada, France, Greece, Israel, and Singapore -- also push plaintiffs toward mediation.

ADR is not the answer to all problems. In some cases, a plaintiff may want to establish a legal precedent or bring a problem to public attention; mediation is no good for such purposes, since it's confidential. However, anybody who simply has a private gripe with somebody else is going to have to think sooner or later that it might be wiser to pursue mediation instead of going for a ride through the courts -- with a lawyer at the wheel and the meter turning over in a blur.



* THE MAKING OF THE FITTEST (7): The previous installment in this series demonstrated that the odds of a dark-furred mutant mouse arising from a population of light-furred mice are actually pretty good. Now let's consider how long it takes for the dark color gene to propagate through the population.

This matter is well-understood by population geneticists. The time in generations T is a function of the selective advantage of the black fur trait, S, and the size of the effective breeding population Ne:

   T = (2 / S) * ln(2 * Ne)

Given an Ne of 10,000 individuals, the number of generations corresponding to a given selective advantage is:

   S = 0.001:   19,807 generations
   S = 0.01:     1,981 generations
   S = 0.05:       396 generations
   S = 0.1:        198 generations
   S = 0.2:         99 generations
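The table can be reproduced directly from the formula:

```python
# Fixation time T = (2 / S) * ln(2 * Ne), in generations, for a range of
# selective advantages S, with an effective population Ne of 10,000.
import math

Ne = 10_000
for S in (0.001, 0.01, 0.05, 0.1, 0.2):
    T = (2 / S) * math.log(2 * Ne)   # math.log is the natural logarithm
    print(f"S = {S}: {round(T):,} generations")
```

Note how the payoff is inversely proportional to the advantage: a gene that is merely 0.1% better still sweeps the population, just a couple of hundred times more slowly than one that is 20% better.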

The calculation may be complicated by other factors, such as the possibility that the black mutation isn't passed down -- either the mutant gets killed off, or the mutant gene gets lost in the shuffle of chromosomes. However, if the gene dies out in one pass, it will certainly arise again later, and the odds of it losing out a second time are very low.

This example of light-colored and dark-colored mice is based in the real world. In Arizona, populations of light-colored and dark-colored mice both live in the desert, with the light-colored mice in sandy areas and dark-colored mice in areas featuring lava flows. The precise mutation that causes these two variants to differ is known.

* It is absolutely critical to emphasize that evolution by natural selection takes place over an extremely long timeframe. 100,000 years is relatively "brief" on an evolutionary timescale, and in fact it's so short a period on the geological timescale that it's barely possible to make it out. However, recorded history is only about 5,000 years old -- and 100,000 years is 20 times longer than that. The timescales over which the emergence of new organisms by natural selection takes place are difficult to conceive of.

The fossil record of these slow changes is incomplete: it clearly demonstrates that there has been a successive emergence of different organisms on Earth, but it does not provide meticulous detail concerning the matter. Our recent ability to observe the genome, however, does provide us with a very detailed record of the changes undergone by organisms in their journey through time to the present. [TO BE CONTINUED]



* INFRASTRUCTURE -- COMMUNICATIONS (3): In the early days of personal computer communications, PCs were linked over the ordinary telephone system with a "modem" (meaning "modulator / demodulator") that turned computer data into a set of audio tones. A user would simply command the modem to dial another modem, and once the hookup was established, the phone line was used to transfer files or to trade messages on a "bulletin board system (BBS)", the ancestor of the internet forum. The old modems were criminally slow -- 9,600 bits per second (BPS) was regarded as good.

From the point of view of the phone system, a modem was conceptually like a fax machine, merely some device using the system for purposes other than voice communications. These days, "dial-up" is still around, though the rate is more like 56,000 BPS -- but it's gradually being shoved to the side by "high speed (HS)" access. There are a range of techniques for HS access, one of the most common being the dedicated "digital subscriber line (DSL)", something like a fast local loop to hook into the digital network. Cable-TV connections also support HS access.

Even with dial-up, there's no point-to-point connection between users any more -- the dialup link is merely used to reach a "server" on the internet. The digital internet system uses high-speed links, such as fiber-optic connections, just as the backbone of the phone system does, but its basic operation is different. The telephone system is "circuit switched", meaning a connection is made from one user to another and maintained for the duration of the conversation. The internet is "packet switched", meaning the data communications are broken into chunks or "packets" and sent one at a time -- and not all necessarily by the same route or even in the same order. The packets are numbered so they can be reassembled at the destination. The scheme by which packets are addressed and routed is called the "Internet Protocol (IP)". The system was designed to be highly fault-tolerant, the idea at the outset being to survive a nuclear attack -- the internet is based on the ARPAnet, designed by the US Advanced Research Projects Agency during the Cold War to link up ARPA research collaborations.
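The packet idea is easy to demonstrate. This little Python sketch (the names are mine, and it's a toy illustration, not any real IP implementation) numbers the chunks, scrambles their arrival order, and reassembles them:

```python
import random

def packetize(data: bytes, size: int):
    """Split data into numbered chunks, as a packet-switched network does."""
    return [(seq, data[i:i + size])
            for seq, i in enumerate(range(0, len(data), size))]

def reassemble(packets):
    """Sort packets back into order by sequence number and rejoin them."""
    return b"".join(chunk for _, chunk in sorted(packets))

message = b"packets may arrive out of order"
packets = packetize(message, 4)
random.shuffle(packets)          # simulate different routes / arrival order
assert reassemble(packets) == message
```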

* One of the interesting questions posed by the split identity between voice and computer communications is: why do phone companies charge for long distance when it costs nothing to log into a website in, say, Russia?

The answer is: because they can. The bandwidth of communications links in either case is so high that the cost of a long-range communications transaction is tiny. With the prevalence of the internet, however, people quickly figured out a way to use the internet to get around the voice communications bottleneck, sending digitized phone conversations as digital packets over the internet. Why not? Digitized voice is just another form of digital data -- though conversations do have to be a bit more timely than, say, a download from a website.

The scheme, known as "voice over internet protocol (VOIP)", uses what are known as "virtual circuits" to try to guarantee timeliness, the idea being to make a packet-switched system look like a circuit-switched system at a higher level. With VOIP technology, long distance charges disappear. The traditional phone companies are only milking a dying cow, and the cow's going to be dead in the relatively near future. British Telecom in the UK is now converting the entire country to a purely digital system, the intent being to get rid of the overhead of supporting a dual system. The traditional telephone system is in its last days. [TO BE CONTINUED]



* CELLULOSIC ETHANOL? As discussed in an article from AAAS SCIENCE ("Biofuel Researchers Prepare To Reap A New Harvest" by Robert F. Service, 16 March 2007), the energy crunch that arrived with surprising speed after the turn of the century has led to a worldwide interest in biofuels. The most common notions of biofuels -- discussed here a few years back -- include ethanol made from corn or sugar cane; biodiesel made from rapeseed, palm oil, or even coconut oil. Biofuels promise, at least in principle, not only freedom from uncertain oil supplies in unstable countries, but also an exit from rising carbon dioxide emissions -- biofuels take up carbon dioxide from the atmosphere, so there's no net release of carbon dioxide when the biofuels are burned.

However, current biofuel technology has a number of drawbacks. Corn-based ethanol is regarded as the worst case, with some estimates showing that it requires more energy to produce than it provides -- in itself it doesn't produce net carbon emissions, but the process is reliant on technology that does. The "negative estimates" are generally regarded as exaggerations, but even more mainstream estimates show it only provides about 50% more energy than required to make it. Furthermore, there's no way that corn-based ethanol could provide more than a tenth of US fuel needs even if all US corn production were converted to ethanol, and the increasing use of corn for ethanol is driving up US food prices. With inefficiencies in production and demand on the corn crop, corn-based ethanol isn't any particular bargain at the pump compared to gasoline.

Other feedstocks, particularly sugar cane, are more efficient, at least in the sense of providing a much higher ratio of energy out to energy in. However, sugar cane's still a crop plant, requiring good farmland and care to grow and competing against food production. Wouldn't it be nice, so the thinking goes, to be able to use as feedstocks plant material that can't be used as food, such as cornstalks, straw, prairie grasses, lawn clippings, and fast-growing trees? A joint paper produced by the US Department of Energy (DOE) and US Department of Agriculture (USDA) in 2005 suggested that the USA could obtain 227 billion liters a year, 30% of the nation's vehicular fuel needs, from 1.3 billion tonnes of waste biomass, without major impact on food or timber production. In addition, a paper published by University of California at Berkeley researchers in 2006 suggested that while use of corn-based ethanol only cuts net greenhouse-gas emissions by 18%, cellulosic ethanol could cut emissions by up to 88%.

The obstacle is that such "cellulosic" materials are hard to convert into fuels. It can be done, but with current technology it's much more expensive to produce "cellulosic ethanol" than corn-based ethanol. However, that unfortunate situation is now changing: billions of dollars of investment capital are flowing into biofuels in general, and in early 2007 the DOE awarded $385 million USD in contracts for the construction of six cellulosic-ethanol pilot plants that will produce almost 500 million liters of ethanol a year. That's only a few percent of US corn-based ethanol production, but backers of cellulosic ethanol believe the technology has finally reached the ignition point and is poised to take off.

* Production of cellulosic ethanol presents a formidable challenge. It's very easy to produce ethanol from cane sugars, and not particularly difficult to produce it from cornstarch -- though the starch first has to be broken down into simple sugars by an enzyme called "amylase", which is why sugar cane is a more efficient feedstock than corn. Once the raw sugars are available in solution, they can be fermented by yeasts to create ethanol, which is then distilled out of the solution. Cellulosic materials are a much harder nut to crack. They consist of three main components: cellulose, hemicellulose, and lignin.

To turn cellulosic materials such as leaves, stalks, grasses, and trees into fuel, the plant fibers made up of cellulose and hemicellulose first have to be broken down into simple sugars. The lignin is simply a nuisance and has to be disposed of in some way. Breaking down the plant material is the first technical obstacle, but it's not the only one. Cellulose breaks down into six-carbon sugars like glucose, ready for fermentation into ethanol, but hemicellulose breaks down into five-carbon sugars like xylose -- and no microorganism found in nature ferments xylose into ethanol. There are some microorganisms that can metabolize xylose and similar sugars, but they don't produce ethanol as an end product.

The first steps toward production of cellulosic ethanol actually focused on fermenting xylose and other five-carbon sugars. In 1985, a team led by microbiologist Lonnie Ingram of the University of Florida at Gainesville genetically modified the human colon bacterium, Escherichia coli, to metabolize a wider range of sugars into ethanol. The modified bacterium proved able to convert 90% to 95% of the sugars in biomass to ethanol. The problem was that it could tolerate no more than 4% ethanol in the final fermenting solution; since distillation of ethanol from solution is energy-intensive, it pays to have the highest concentration of ethanol possible. Ingram and his colleagues have since boosted the tolerance of the bacterium to 6.4% ethanol, and have licensed their technology to businesses working on cellulosic-ethanol production.

It's not the only game in town. In 1995, researchers at the National Renewable Energy Laboratory (NREL) in Golden, Colorado, just outside Denver, genetically modified a bacterium named Zymomonas mobilis to ferment xylose and other five-carbon sugars, along with the six-carbon sugars the bacterium ferments naturally. This work was later passed on to researchers at DuPont in Wilmington, Delaware, with the latest Zymomonas strain capable of tolerating 10% ethanol. It is also now going into commercial service.

Work has been done to genetically modify yeasts, the standard microorganism for fermentation of traditional feedstocks, to handle five-carbon sugars. In 1993, a team of researchers at Purdue University in Indiana led by microbiologist Nancy Ho produced a modified yeast that could handle xylose, and have since honed the modified yeast to handle a wider range of five-carbon sugars, as well as increase its productivity.

* All that is encouraging, but more needs to be done. Yeast can process a batch of glucose to ethanol in a few hours, but the modified microorganisms can take a day or two to do the job, reducing production throughput. The modified microorganisms are also not very robust: not only do they have problems dealing with even moderate concentrations of ethanol in solution, they are also jammed up by other by-products of the biomass fermentation process. Improvements are needed.

Work is also being done on the first step in the cellulosic ethanol production process: breaking down the biomass for fermentation. The traditional approach is to break up the biomass with dilute acids and steam, then treat the resulting soup with cellulase and hemicellulase enzymes. This approach has limitations, mostly due to its use of acids, which are corrosive and generate by-products that inhibit the downstream enzymes and fermenting microbes.

A team led by Bruce Dale, a chemical engineer at Michigan State University (MSU), has come up with an alternate scheme that uses ammonia and other basic substances in a low-temperature process that effectively breaks down leaves, grasses, and straws. The ammonia can be recovered and reused, and the process produces fewer enzyme inhibitors. The only problem is that it doesn't work so well with lignin-rich woody feedstocks such as trees, and so work continues on refining the process.

* There's also work on genetically modifying plants to be better fuel feedstocks. In 1999, a team under Vincent Chiang, a biochemist at North Carolina State University (NCSU) in Raleigh, announced the development of a modified poplar with 50% less lignin than a natural poplar. The work was originally intended to come up with a better feedstock for paper processing, but it's also relevant to the use of poplars as a feedstock for ethanol production. The NCSU team has not been able to improve on the 50% reduction, however, and has more recently been working on modifications of tree cellulose fibers to reduce their crystallinity and make them more vulnerable to cellulase enzymes -- reducing the need for enzymes. Researchers have also been tinkering with other feedstocks, such as switchgrass, to improve yield and reduce lignin content.

Right now, producing a gallon of cellulosic ethanol costs as much as $4 USD, almost four times as much as the cost of corn-based ethanol. Industry officials believe that once the first pilot plants come on line in 2009, that cost will have dropped to $2 USD a gallon, and foresee steady progress towards "break-even".

Backers of cellulosic ethanol production are perfectly aware of the limitations of corn-based ethanol, but appreciate the way it's paved the way. It has created an infrastructure for the production, distribution, and sale of ethanol that cellulosic ethanol is leveraging off of to get started. In a few decades, corn-based ethanol may be a thing of the past, but if so it will have served an honorable role in establishing the biofuel economy.



* NTD AGENDA: Billions are now being spent to fight disease in developing nations, but the funding has been heavily focused on AIDS, with a secondary focus on tuberculosis and malaria. However, a wide range of diseases that afflict the developing world have been more or less ignored, partly because they don't occur in any real way in developed countries and so awareness there is low. According to an essay in the January 2007 issue of SCIENTIFIC AMERICAN by the prominent Jeffrey D. Sachs of the Earth Institute at Columbia University, efforts to control these "neglected tropical diseases (NTD)" would be relatively cheap and have a big payoff. There are thirteen NTDs, including hookworm and other worm infections, lymphatic filariasis, dracunculiasis (guinea worm), leprosy, and trachoma.

Of these thirteen diseases, nine -- all the worm infections, leprosy, and trachoma -- can be dealt with cheaply and effectively through preventative or curative interventions. The incidence of dracunculiasis can be greatly reduced by filtering water through cheesecloth. Insecticide-treated bed nets can break the transmission of lymphatic filariasis and do much to block the transmission of malaria. Deworming agents are cheap and easy to administer, and in places where the worm infections are common, schoolchildren can be protected by a mere three treatments a year. A number of the "big pharma" companies -- including Merck, GlaxoSmithKline, Johnson & Johnson, Pfizer, Novartis, and Sanofi-Pasteur -- have signed up to contribute medicines and provide other assistance.

The US government has provided $15 million USD to fight NTDs, but it is estimated that $250 million USD a year is needed for Africa alone. The NTD effort can dovetail with current, relatively high-priority efforts to control malaria, since some of the tools -- bed nets, for example -- are useful against both malaria and some NTDs, and the infrastructure being built up to fight malaria can be used to fight NTDs. Besides, many children in Africa are "polyparasitized", suffering from malaria along with one or more NTDs, and it makes little sense to fight one but not the other.

A comprehensive global anti-NTD campaign will cost about $3 billion USD a year. This is not cheap but it is definitely within the means of the world's prosperous nations if they coordinate their efforts. The investment will also pay off by ensuring that the populations of undeveloped countries are healthy and productive. It will also mean hundreds of millions of healthier and happier children.



* MAKING BABIES: It's hard to think of any subject more prone to controversy than human reproduction, and an article in THE ECONOMIST ("Buying Babies, Bit By Bit", 23 December 2006), shows just how crazy it is when it becomes a business.

The slippery slope into this disorienting commerce began with "in-vitro fertilization (IVF)", or the technique of conceiving a baby in a petri dish. IVF was originally intended to allow people with certain types of reproductive system problems to still have kids, but now it's in demand from a wide clientele. That puts the operations trying to meet that demand into a scramble to find sperm, eggs, and wombs. It also means wading through an international thicket of laws, with procedures in the process jumping across borders as guided by favorable regulations.

Sperm donation is complicated by the issue of HIV-AIDS and by the potential for litigation over parentage. The solution to these issues is to freeze the sperm and test it for HIV six months later -- it takes a bit of time for an HIV infection to become detectable -- and to ensure a completely anonymous donor process, with neither the donor nor the recipient able to trace the link between the two, eliminating the possibility of claiming parental rights or being slapped with child support.

One of the world's biggest sperm banks is Cryos in Denmark, which has more than 200 donors on its books and sells 10,000 units of semen to fertility clinics in 50 countries each year, producing about a thousand pregnancies. Cryos cannot do business in countries where anonymous donation is illegal -- Sweden, Norway, the Netherlands, Britain, Switzerland, and Australia -- or where any sort of sperm donation is illegal -- Italy being the big example. There is, however, no real obstacle to conducting business with clients in these countries if they fly to Denmark for the deal.

Cryos has an American branch that conducts business online, offering $75,000 USD for exclusive rights to sperm from a donor judged worthy of selection. Cryos has stateside competition from California Cryobank, which pays $75 USD per specimen, along with occasional gift vouchers and movie tickets as bonuses. The profit margin is good, with California Cryobank getting $240 to $400 USD per specimen, depending on quality. Basic (anonymous) information on donors is available for free, but California Cryobank offers a wide range of additional help on a paid basis, including a complete profile and consulting services. A client can take options to reserve specimens from specific donors.

* Ova donation is trickier, since it requires that the donor take "super-ovulation" drugs for about two weeks, with the ova then released for harvest. In the UK, it's illegal to pay ova donors except for necessary expenses, and so in the past it was done essentially only among extended families or circles of friends. However, ova donation within the UK has all but dried up, since in the spring of 2006 the law changed to require that children born of donor sperm or ova be told the actual identity of their biological parents after turning 18. That of course grossly complicated matters and made the procedure too cumbersome for most to want to bother with. Not to worry, however, since Spain is extremely liberal on ova donation, mostly due to strong public sympathy for infertile women. A qualified donor can earn 800 pounds, and Spanish fertility clinics run partnerships with clinics in other countries that do the "groundwork" and then send the donors to Spain for ova harvesting. One London clinic claims it sends a donor to Madrid every week.

In principle, it's illegal to pay for donated ova in the US as well, but it is perfectly legal to "compensate" a donor for "time and trouble", with no legal ceiling on the compensation. US egg brokers will pay from $5,000 to $7,000 USD to donors. Classified ads, often placed in campus newspapers, also ask directly for ova, specifying the desired characteristics of the donor. In some cases, donors have received as much as $100,000 USD.

However, smart, beautiful, and athletic American women end up having something of a seller's market in the ova donation game, since such folk usually make plenty of money in more orthodox lines of work and see no need to engage in such intrusive and inconvenient ways of making money. It is more economical to source ova from qualified women in poorer countries. For example, a company named GlobalART obtains ova from Eastern European women for use in America and Israel. Sperm is obtained from US donors, with the IVF performed in Romania. The company charges $13,650 USD for the eggs obtained from a donor harvest. It is also possible to find websites set up by individual Russian egg donors online, though the hazard of fraud is obvious.

* Surrogate mothers are nothing new, the Old Testament referring to wives who couldn't bear sons for their husbands and so told their husbands to impregnate their handmaidens. With IVF, the system is more sophisticated: now such women can have IVF performed with their own ova and their husbands' sperm, with the embryo implanted in a surrogate mother. Still, the transaction is fraught with potential difficulties, particularly since in the vast majority of countries the birth mother is the legal mother. In fact, surrogacy is illegal in most places, and in countries where it is legal, it may be constrained in major ways. For example, in the UK it's illegal to pay a surrogate mother, and the only way to get it done is through a clumsy scheme of social networking. Worse, in the end the surrogate mother can still insist on keeping the baby.

In the US, the legality of the matter is left up to the individual states. Some ban it outright; others rely on "case law", the consideration of past judgements on the matter instead of laws on the books. California's case law is favorable and the state's legal system is well set up to handle the matter, so it has long been a good place to find surrogate mothers. Illinois has recently liberalized regulations on the matter and is catching up. However, surrogacy is troublesome even ignoring legal matters: it demands a lot of effort on the part of the surrogate, with a fee running easily to $20,000 USD and additional expenses dragging the bill up to $100,000 USD. It's not as cheap and easy as going to the handmaidens any more.

It is likely that improvements in reproductive medical technology will reduce the need for IVF in the future -- but there will always be those who want children and don't have the biological means of producing them on their own, and so it is unlikely that IVF is going to disappear any time soon.



* THE MAKING OF THE FITTEST (6): As mentioned in the last installment, all humans are mutants. However, this doesn't mean that we're all monstrosities or latent superhumans, since many of our mutations are either unnoticeable or, if they are noticeable, not important.

* For an example of the usefulness of mutations, consider a population of light-colored mice living in a sandy environment. If a mutation produces a dark-colored mouse, it will be at a severe disadvantage against predators and will likely die out quickly. Suppose lava flows then run across parts of the region. Light-colored mice will be at a disadvantage, while the dark-colored mice will have an advantage.

It won't take that long to get dark-colored mice, either. Careful lab studies of millions of mice show that a mutation occurs in about one of every half-billion sites -- letters -- of the mouse genome per mouse. (Since the mouse has about 5 billion sites, that's about 10 mutations per mouse.) A specific gene has, on the average, about 1,000 sites that can be mutated, which means that any particular gene will, on the average, undergo a mutation once in every 500,000 mice. A gene named MC1R controls mouse fur color. About ten different mutations can change MC1R from specifying light fur color to dark fur color, and a mouse has two copies of the gene. That means that about 1 in 25 million light-colored mice will carry a new mutation for dark fur and can produce dark-furred offspring.
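The odds multiply out like so (a quick Python tally of the numbers above; the variable names are mine):

```python
# Tally of the mutation odds quoted above.
site_rate = 1 / 500_000_000   # mutations per site per mouse
gene_sites = 1_000            # mutable sites in a typical gene
dark_sites = 10               # MC1R sites whose mutation darkens fur
copies = 2                    # a mouse carries two copies of MC1R

print(f"one mutation in a given gene per {1 / (site_rate * gene_sites):,.0f} mice")
print(f"one new dark-fur mutant per {1 / (site_rate * dark_sites * copies):,.0f} mice")
```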

That sounds almost as bad as trying to win the lottery, but the odds aren't as intimidating as they sound. Consider even a small population of 10,000 mice, with each female (half the population) bearing a litter of five mice a year. That means 25,000 babies a year, with a new generation every year. That means that a black mouse will arise, on the average, once in a thousand years. Kick up the population to 100,000, and the dark mutant will arise once a century, which is a blink of the eye on an evolutionary timescale. [TO BE CONTINUED]



* INFRASTRUCTURE -- COMMUNICATIONS (2): The telephone central office used to be a busy place, in the days when humans actually kept things going, but now all the switching is completely automated. The older offices are mostly empty except for equipment tucked off into a corner, with nobody around except to check up on it or repair it when needed. Sometimes space in the building is leased out. The central office also features banks of batteries to keep it running if the power goes down.

Central offices often feature bricked-up windows, a legacy of the Cold War in which they were "hardened" to withstand attack. High-priority centers were even built in underground bunkers. These bunkers are now often used for storage. In an "age of global terror", central offices still in use are protected by other measures -- for example, making them seem nondescript, not marking them with signs, and not telling anyone without a need to know where they are.

* The original telephone system was local, usually covering a neighborhood, possibly a city at the most. Long-distance messages were sent by telegraph. The problem was that the power output of a phone wasn't big enough to carry the signal any farther.

The solution was to build electronic amplifiers to boost the power of the signal at intervals. Vacuum-tube electronics were established about the time of the First World War, leading to the introduction of long distance calling after the conflict. Ultimately, analog telephone lines featured amplifiers spliced into the line about every 13 kilometers (8 miles) or so, though they were sometimes more tightly spaced.

Long-distance analog telephone lines are generally a thing of the past, however. Once a phone call reaches a central office, it is converted into a digital signal, which is more resistant to noise and easier to timeslice. It is then sent out over long-distance high-density communications links.

The first high-density link was coaxial cable, which appeared in the 1940s. It was originally used for high-grade broadcasting links, for example to connect a music studio to a nearby broadcast studio. However, by the 1950s the main high-density link was the microwave relay. Microwave towers now dot the landscape with their distinctive "horn" antennas. Their range is limited by interference from rain and the like, as well as the curvature of the Earth -- the maximum range is about 50 kilometers (30 miles).

Microwaves brought two things to the party: first, they didn't require stringing land lines from point to point, and second, they operated at high frequencies. This was important because the amount of data -- in this case, the number of voice channels -- that can be transmitted over a communications link is proportional to the range of frequencies, the bandwidth, of the link. The higher the frequency, the greater the bandwidth -- 1% of 300 megahertz is ten times more than 1% of 30 megahertz.
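To make the proportionality concrete: assuming the standard 4 kHz width for an analog voice channel (my assumption -- the article doesn't give channel widths), a 1% slice of the band holds ten times as many channels at 300 MHz as at 30 MHz:

```python
def voice_channels(carrier_hz, fraction=0.01, channel_hz=4_000):
    """Voice channels that fit in a given fraction of the carrier frequency,
    assuming a 4 kHz analog voice channel."""
    return int(carrier_hz * fraction / channel_hz)

print(voice_channels(30e6))    # 75 channels in 1% of 30 MHz
print(voice_channels(300e6))   # 750 channels in 1% of 300 MHz
```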

Microwave relays use multiplexing to cram more voice channels into a link. They not only use timeslicing, they also use "frequency multiplexing", sending multiple timesliced voice streams with each in a different part of the band. It's the same principle by which a broadcast radio receiver can pick up different radio stations over different parts of the AM or FM bands, with each broadcast at a different "carrier" frequency so they can share the same band without interference.

microwave links

* Microwave links persist as dedicated communication links, since they're cheap to implement, but for mass voice and data traffic they are effectively obsolete. Since the amount of voice traffic that can be carried over a link is proportional to the frequency of the link, there's a push to go to higher frequencies. The main problem is that at higher frequencies, it's harder to transmit signals through free space -- a light beam will be blocked by rain or dust.

The answer is fiber optics, in which light signals can be sent through a fiber of very pure glass -- so pure that kilometers of fiber have less attenuation of light than a typical windowpane. A single fiber can carry about 130,000 voice channels. Long-haul fiber-optic links have multiple fibers and can carry millions of voice channels.
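For a sense of scale, 130,000 voice channels at the standard 64 kbit/s rate for digitized telephone voice (the rate is my assumption; the article gives only the channel count) works out to roughly 8 gigabits per second per fiber:

```python
channels = 130_000
bits_per_second_per_channel = 64_000   # standard digitized telephone voice
gbps = channels * bits_per_second_per_channel / 1e9
print(f"{gbps:.2f} Gbit/s per fiber")  # about 8.32 Gbit/s
```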

Fiber-optic lines are now commonly carried on local telephone poles. They can be distinguished from ordinary copper phone lines by the loops of cable they feature at intervals, which are there to help with splicing. Since fiber-optic threads are so fine, splicing is tricky and delicate; if it isn't done properly, it can introduce substantial attenuation in the link. Having some slack cable permits changes to be made with a single splice instead of two. Optical-fiber lines are also often buried, particularly high-capacity long-haul lines. Since any one line carries so much traffic, a backhoe that cuts a fiber-optic line can knock out a substantial chunk of regional communications -- making protection of fiber-optic lines something of a national security issue.

* Fiber-optic lines are also now the basis for international voice communications. Oceanic copper telegraph lines were widespread by the end of the 19th century, but overseas telephone communications didn't arrive until the 1920s and 1930s, using long-range radio links -- which were expensive and unreliable. The first transoceanic telephone cables didn't go into operation until the 1950s.

By the 1960s, satellite communications was coming online, but due to the time delays of shooting a signal far up into space and back down again, traditional comsat technology has always been at a disadvantage for voice communications. The first transoceanic fiber-optic lines went into service in the late 1980s, and now they handle international voice communications. [TO BE CONTINUED]



* RECYCLED PAPER: SCIENTIFIC AMERICAN's monthly "Working Knowledge" column for November 2006 described paper recycling, which like almost all industrial processes is a lot more technical than it sounds.

Discarded paper products are not ideal sources for making paper. Every time paper products are recycled, about a fifth of each sheet or piece is too broken-down to be used again and is discarded as waste, which means that after about five recyclings the original paper is completely gone. There is also the fact that used paper may be in different colors, will be printed with inks, and will be smudged or dirty. Other difficulties include coatings and "stickies", the weak glue on post-its being the example that comes most quickly to mind, but it's also present on stamps, labels, and the like. The glue is a real problem since it can gum up the industrial process.

The initial step in paper recycling is to sort the paper, by color or grade or whatever. This has traditionally been done by unskilled labor, which is expensive, but automated sorting machines have now been introduced that promise to reduce costs. Once sorted, the paper streams are fed into the recycling process in stages: pulping the paper in water, screening and cleaning out contaminants, de-inking, and -- for the higher grades -- bleaching.

Not all paper requires all these steps. Fabricating cardboard for cereal boxes and the like can get by with fewer steps, and that's where recycled paper has the greatest cost advantage. There's a smaller cost savings for corrugated boxes and newsprint. Producing office paper from recycled paper requires such a high level of processing that there's no economic advantage to recycling, except for reducing the waste stream.



* SPACESHIP TROOPERS: The notion of using rocket transports to send troops to trouble spots around the world in a flash has been around since the beginning of the space age, but traditionally it's been strictly a blue-sky idea that the brass didn't take seriously. According to an article in POPULAR SCIENCE ("Semper Fly" by David Axe, January 2007), the US Marines are beginning to think that delivering troops by suborbital rocket is not merely possible, but even useful.

The scheme is being promoted by Roosevelt Lafontant of Schafer Corporation, a miltech consulting firm working with the Marines. Lafontant, a retired Marine lieutenant colonel, believes that a suborbital troop transport -- designated "Small Unit Space Transport And INsertion (SUSTAIN)" -- would not only be able to allow the Marines to send troops anywhere in the world on a moment's notice, it would also allow the US to intervene anywhere without obtaining permission to overfly foreign airspaces. By the Space Treaty, nations only have control up to 80 kilometers (50 miles) above the surface of the Earth, and SUSTAIN would fly above that altitude.

Lafontant became interested in the idea in the late 1990s, when he was still in the Marine Corps. He managed to sell it to USMC brass in 2002, and testified before Congress in 2003 to promote the idea. Some politicians laughed at the idea, but in October 2004, Burt Rutan's SpaceShip One became the first private vehicle to perform a suborbital space flight. Backers of SUSTAIN saw SpaceShip One as a demonstrator for SUSTAIN that showed the idea was practical.

SpaceShip One uses a jet-powered mothership to launch a small suborbital vehicle. SUSTAIN would be similar but larger, with the suborbital vehicle able to carry a squad of 13 Marines. The conceptually tricky part about the scheme is to figure out a way to recover the Marines after the mission. The SUSTAIN suborbital vehicle won't be able to fly back into space, and though retrieval wouldn't be as time-critical as insertion, it's still a difficult problem.

For the moment, SUSTAIN remains a paper plan, and few of the program's backers are pushing for a demonstration just yet, preferring to focus on enabling technologies. For example, the Air Force is working on an unmanned suborbital vehicle named the "Common Aerospace Vehicle (CAV)" for a range of missions, and SUSTAIN should be able to leverage CAV technology. Some critics claim that SUSTAIN is unrealistic, but its backers believe that if they can show all the relevant pieces are in place, they should be able to sell a demonstration project.



* MEDICAL ID THEFT: There is an old story about a bank robber who was asked at his trial why he robbed banks. His answer was simple and straightforward: "Because that's where the money is." Given that robbers are likely to accumulate any place money accumulates, it's not surprising -- as reported in BUSINESS WEEK ("Diagnosis: Identity Theft" by Dean Foust, 8 January 2007) -- that the massive amounts of money being spent on health care in the USA are drawing the attention of identity thieves.

In early 2004, Lind Weaver of Palm Coast, Florida, got a stiff bill from a hospital for the amputation of her right foot. There followed the predictable bureaucratic nightmare as she tried to convince the billing office of the hospital that there had been a mistake. She finally had to go to the office of the hospital administrator and put her feet up on his desk to show him that she hadn't been the one to obtain the amputation.

Somebody had managed to get hold of her address, Social Security number, and even her insurance ID number. If she felt inclined to shrug it off as a bad joke after the fact, in 2005 she found out it was no joke at all. When she was hospitalized for a hysterectomy, a nurse checked her chart and told her: "You have diabetes." That came as a surprise to Weaver, turning into a shock when she realized that her medical data was now mixed up with the fraudster's. She says: "I now live in fear that if something ever happened to me, I could get the wrong kind of medical treatment."

* Incidents of medical ID theft are now on the rise in the USA since it's so profitable: medical fraud can return orders of magnitude more financial reward than simple credit-card fraud. Given the temptations of that kind of money, not all ID theft is being conducted by patients after medical care. In some cases, the crooks are doctors trying to supplement their income by filing bogus claims. A Boston psychiatrist named Richard Skodnek was busted by the FBI in 1996 for 136 counts of fraud -- not merely billing Blue Cross for treatments patients had already covered, but also billing for treatments of relatives of his patients who had never come to his office, Skodnek having managed to obtain data on them as well.

Worse, in an increasing number of cases, the fraud is the work of organized crime rings. In 2005, California law enforcement busted such a ring operating in Milpitas. The scamsters set up a clinic and offered patients from a local retirement center a free checkup and a case of nutritional supplements. They got the relevant data from the patients, then billed insurance and Medicare for diagnostic tests that were never done, pulling in $900,000 USD in three months. Says a Florida state insurance fraud investigator: "Yesterday's drug dealers are now working in today's health-care fraud. It's more lucrative, and they don't face the same dangers they do in the narcotics trade." They don't have to worry about rival gangs coming over to shoot up their clinic; besides, if the law catches them, the penalties are lower.

Once patients have been compromised by medical ID theft, they can be in for a lot of trouble for a long time. In the US, financial ID theft, such as credit-card ripoffs, is covered by the Fair Credit Reporting Act, which allows victims to review and correct their records; the same protections are not yet available for health records. Joe Ryan, a Denver native, placed an ad in a newspaper, and the clerk asked him for his Social Security number. He gave it -- and then got hit with a bill for over $41,000 USD in medical costs. When he tried to fix things with the hospital, staff told him he couldn't be Joe Ryan since his signature didn't match the one in their files, which had been provided by the scamster. Ryan did finally manage to get the hospital to drop the bill, but his credit was ruined and insurers upped his rates.

Anybody who's dealt with hospitals knows that they can sometimes be very bureaucratic and inept; to no surprise, law enforcement officials say that the security procedures of many health-care organizations are almost criminally lax. Many of the health-care frauds are inside jobs, with clerks passing on patient records to accomplices. Even simply selling off patient records can be profitable, since they have a street value of $50 USD or more.

Advocates of a national digital medical record system assert that once a universal system is in place, security features and software to identify anomalous patterns of transactions should cut down the incidence of health-care fraud. That is likely to be true over the long run, but it is hard to design an elaborate software system to be secure from the outset, with crooks very quick to find and exploit vulnerabilities not foreseen by system designers. In the short term, moving to a digital medical record system may permit scams on an entirely new scale. For the moment, at least some medical facilities are insisting that patients provide picture ID for their files; once the patients understand why, they usually do so without objection.



* THE MAKING OF THE FITTEST (5): The peppered moth is one of the most famous (or, to some, infamous) examples of evolutionary change, but field biologists have detailed other examples. Researchers in California performed an experiment to observe the effect of small changes in pigeon appearance on the bird's vulnerability to attacks by peregrine falcons, one of the pigeon's most dangerous predators.

The pigeons in the study had six different plumage schemes, and seven years of observation showed that pigeons with a plumage scheme featuring a white patch on the rump were less vulnerable to falcon attacks, by a surprisingly large margin -- it appeared that the white patch confused the falcon while the pigeon was taking evasive action. To prove the linkage between the white patch and the higher survival rate, the researchers caught pigeons and swapped rump feathers between white-rumped and dark-rumped birds, clipping off the feathers and fixing them back on with latex glue. The falcons then had no particular trouble catching pigeons that had lost their white rumps, while the "fakes" given white rump feathers became harder to catch. The study also showed that white-rumped pigeons were gradually coming to predominate in the population.

* Another interesting example is the three-spine stickleback fish. During the ice ages, ocean-living sticklebacks invaded fresh water lakes and streams, only to become reproductively isolated after the end of the ice ages. While oceanic sticklebacks have a protective row of more than 30 bony plates running up their sides, the freshwater sticklebacks apparently don't need them -- and so have no more than nine, sometimes no, armor plates on each side.

After a chemical eradication program killed off the stickleback population of Loberg Lake in Alaska, the lake was recolonized by oceanic sticklebacks in 1982. From 1990 to 2001, regular sampling showed the frequency of the oceanic form of the stickleback falling from 100% to 11%, with relatively unarmored sticklebacks making up the rest of the population.

* Why did the lake sticklebacks "decide" to start getting rid of their armor plates? They didn't -- a random mutation simply got rid of them, and the "lucky winners" with the mutation outbred those who didn't have it. It's the same process that created the icefish. A random mutation that would have been harmful to a normal fish -- causing the loss of the ability to make hemoglobin -- was a positive benefit to the icefish. That particular mutation was only part of a cumulative series of mutations that gave the icefish a bigger heart, bigger gills, and a heftier circulatory system, with each change arising one at a time and being screened by the "trial and error" process of natural selection.

Mutations arise in the duplication of a life-form's genetic code, its DNA. DNA replication is a common process and sometimes it goes wrong. The most common mutation is to simply get one of the "letters" of the code -- A, C, G, or T -- wrong. However, that's not all there is to the matter, since there may be mutations that cause deletion or insertion of entire blocks of letters, or in some cases duplications of blocks of letters. Mutations are happening all the time. Each one of us has about 175 mutations in the 7 billion letters in our genetic code. Every human is a mutant. [TO BE CONTINUED]



* INFRASTRUCTURE -- COMMUNICATIONS (1): Chapter 7 of Brian Hayes' book INFRASTRUCTURE covers electronic communications technologies. Of course the classic electronic communications system still in use is the telephone.

Alexander Graham Bell famously invented the telephone in 1876. In the old days, a battery powered the telephone's microphone and speaker, while a little generator crank activated the ringer on the other end of the line. All the connections were point-to-point: a phone was connected to a switchboard, with the caller asking the operator to connect to a specific number, also on the switchboard. The operator plugged a cable in to make the connection. This was necessarily a local connection, but longer-range connections could be made by routing through multiple switchboards.

Just before World War II, electromechanical switching systems were introduced, with a dial on the phone being spun to step relays, or "Strowger switches", in the switching center -- the number of movements being determined by the pulses produced by the rotary dialer. The scheme was updated to a fully electronic system in the 1960s, which led to the introduction of the "touch tone" phone, which produced combinations of two tones to tell the switching system how to make the connection -- a scheme known as "dual tone multiple frequency (DTMF)". It is still possible to buy dial phones as a nostalgic novelty, and they still work.
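The DTMF scheme itself is easy to illustrate: each key sends the sum of one tone from a low-frequency "row" group and one from a high-frequency "column" group. A minimal sketch of the standard frequency assignments:

```python
# DTMF: each keypress sends two tones at once -- one from the "row"
# group, one from the "column" group -- so twelve keys need only
# seven distinct frequencies.
ROW_HZ = (697, 770, 852, 941)    # low-frequency (row) group
COL_HZ = (1209, 1336, 1477)      # high-frequency (column) group; a
                                 # fourth column at 1633 Hz serves the
                                 # extra A-D keys on some keypads
KEYS = ("123", "456", "789", "*0#")

def tones(key):
    """Return the (low, high) frequency pair in Hz for a keypad key."""
    for row_hz, row_keys in zip(ROW_HZ, KEYS):
        if key in row_keys:
            return (row_hz, COL_HZ[row_keys.index(key)])
    raise ValueError(f"not a DTMF key: {key!r}")

print(tones("5"))  # (770, 1336)
```

Pressing "5" thus puts a 770 Hz and a 1336 Hz sine wave on the line simultaneously, and the switching center decodes the pair back into the digit.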

classic touch-tone phone

The modern "land line" phone system (as opposed to wireless phones) still remains generally analog at the home end, with the home phone attached to the "local office" by a pair of copper wires, the "local loop". Once upon a time, the two wires for each local loop were strung individually on telephone poles, and such "open wires" still persist in a dying fashion in some rural areas; in general, though, local loops are now bundled together in a single cable, with the wires protected from mutual interference by plastic insulation -- and wound together in a spiral fashion so that no two loops run parallel to each other, which would result in "crosstalk".

The local loops are organized in basic bundles of 25 loops, with up to 24 bundles forming a "superbundle" of 600 loops. Superbundles may be consolidated in cables with up to thousands of loops. The wires are color-coded with an elaborate striped pattern to allow them to be distinguished. The multipair cables are either routed underground or carried on telephone poles. On poles, the multipair cables are strung below steel "messenger cables" that support the weight. Such cables also have bigger cylindrical "splice cases" at intervals to allow two segments to be mated. Underground lines will have "splice cases" poking up out of the ground at intervals.
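The striped pattern is the standard 25-pair color code: five "tip" colors crossed with five "ring" colors uniquely identify each pair in a basic bundle. A sketch of how a pair number maps to its colors:

```python
# Standard 25-pair telephone color code: the two wires of each pair
# carry complementary stripes of one "tip" color and one "ring" color,
# giving 5 x 5 = 25 distinguishable pairs per basic bundle.
TIP_COLORS = ("white", "red", "black", "yellow", "violet")
RING_COLORS = ("blue", "orange", "green", "brown", "slate")

def pair_colors(n):
    """Return the (tip, ring) colors for pair n (1-25) in a bundle."""
    if not 1 <= n <= 25:
        raise ValueError("pair number must be 1-25")
    tip, ring = divmod(n - 1, 5)
    return (TIP_COLORS[tip], RING_COLORS[ring])

print(pair_colors(1))   # ('white', 'blue')
print(pair_colors(25))  # ('violet', 'slate')
```

So pair 1 is white/blue, pair 7 is red/orange, and so on up to violet/slate, after which the next 25-pair bundle starts the cycle again under its own binder color.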

The local loop scheme is obviously cumbersome, and it's also wasteful. A single loop is effectively idle until its user makes a call. It would be nice if one local loop could carry more than one conversation, and in fact this can be done, using a box called a "concentrator", "multiplexer", or just "mux". There are different ways to multiplex telephone conversations -- the most logically straightforward is "timeslicing": sample each conversation at intervals and send the sample values for the conversations interleaved in a cycle. The samples can then be sorted back out to reproduce the original voice signals. Obviously the sampling rate has to be well above the top voice frequency -- about 3,000 Hz for an ordinary analog phone -- and in fact the absolute minimum sampling rate is twice the top voice frequency, or 6,000 Hz. In any case, the mux takes 30 or so lines and timeslices them into a single line for transmission to the central office.
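The timeslicing idea can be sketched in a few lines: sample each line in turn, interleave the samples into one stream, and de-interleave at the far end. A toy model, not actual telephone hardware:

```python
# Toy time-division multiplexing: interleave samples from several
# lines round-robin into one stream, then recover line i by taking
# every Nth sample starting at offset i.

def mux(lines):
    """Interleave equal-length sample lists into one stream."""
    return [sample for frame in zip(*lines) for sample in frame]

def demux(stream, n_lines):
    """Split an interleaved stream back into its component lines."""
    return [stream[i::n_lines] for i in range(n_lines)]

lines = [[1, 2, 3], [10, 20, 30], [100, 200, 300]]
stream = mux(lines)
print(stream)  # [1, 10, 100, 2, 20, 200, 3, 30, 300]
assert demux(stream, 3) == lines
```

In a real mux the samples arrive thousands of times per second per line, so the single output line has to run correspondingly faster than any individual input.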

* One of the interesting aspects of the telephone is the numbering system. In the USA, a phone number consists of ten digits -- a 3-digit area code, a 3-digit central office code, and a 4-digit local code. That sounds in itself like 10,000,000,000 numbers, which would be plenty for the entire USA for a long time to come, but it's not that simple.

The first problem is that an area code can't start with 0 or 1. This goes back to the Strowger switch, in which the 0 immediately connected the user to the operator (the 1 is reserved for other reasons not mentioned). That immediately cuts the number of available numbers to 80%.
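The effect of that restriction is simple counting. A sketch applying only the rule just described -- the real numbering plan has further exclusions:

```python
# Counting 10-digit phone numbers when the leading digit of the area
# code is restricted to 2-9.  This applies only that one rule; the
# real numbering plan excludes other patterns as well.
all_numbers = 10 ** 10     # ten unconstrained digits
valid = 8 * 10 ** 9        # 8 choices for the first digit, 10 for each of the rest
print(valid / all_numbers) # 0.8 -- 80% of the raw total remains
```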

When "direct dialing" was introduced in 1947 -- before that, a long-distance call had to be made through an operator -- North America was cut up into 86 area codes, with 50 more codes in reserve. That gave about 700,000,000 codes, which was thought to be enough to last for centuries. Alas, cellphones, pagers, faxes, and computer modems intervened, and then the phone company was broken up, with each operator demanding a block of numbers.

In the late 1980s, the system was adjusted to permit 800 possible area codes, though some like 911 were reserved, with 650 left over. 350 are in use, but many of the rest are reserved, and only 29 remain free. Over the years, area code regions have fragmented to handle growth. It seemed likely at one time that the USA would run out of phone numbers very quickly, but conservation efforts -- for example, telling phone companies they can't stockpile numbers they're not planning to use -- have pushed the date out a few decades. [TO BE CONTINUED]



* PATENT WARS REVISITED: Back in January 2006, an article in these pages took a look at "junk patents". Now an article in IEEE SPECTRUM ("Patently Obvious" by Kirk Teska, December 2006) says that the courts are about to consider a change in patent law that could potentially have a huge impact.

Traditionally, to be patentable an invention has to be "new" and "unobvious". The "new" element hasn't been troublesome in the past, but the "unobvious" element has led to difficulty, and that's what the Supreme Court is now considering. It isn't a trivial question either, with concerned parties -- Microsoft, a team of legal academics, and the Federal government among them -- putting considerable effort into making their concerns over the matter known to the court. While the US Patent Office has examiners highly qualified to pass such judgements, once they've done so the judgements can be challenged in Federal court -- and some claim that the current standard in the courts for "unobviousness" is much too low, resulting in the corruption of patent law with a flood of "junk patents" and nitpicking lawsuits.

A few interesting examples shed light on the issue. The first is the "Furminator" pet-grooming tool, covered by US Patents #6782846 and #7077076. The patents flatly admit that all the Furminator amounts to is the blade taken from a Sunbeam electric hair trimmer with a handle attached. Pet owners have long used the hair trimmer blade to groom pets, but Furminator believes that adding a handle results in a patentable product, and is now suing three companies for patent infringement.

The second example is a bit trickier. Image scanners have traditionally used their own power supply and microcontroller, but Syscan had a better idea, embodied in Patent #6275309, to borrow power and control from a PC through a PCMCIA plug-in card. Syscan also obtained a patent, #6459506, describing a scanner powered and controlled by a PC over a USB serial interface. Syscan didn't invent the PCMCIA card, USB, or scanners, and the company's patents only specify the combination of PCMCIA or USB with a scanner. Syscan now has lawsuits in progress against five scanner companies.

Most people looking at these examples would consider them obvious and nonpatentable, but it must be remembered that many patents cover systems made up of components and assemblies invented by others, and there's nothing ridiculous about patenting a combination of known inventions in principle. In 1999, a patent was awarded to the inventor of a plastic lawn / leaf bag marked to look like a Halloween jack o'lantern, an item which now proliferates on American lawns in the fall. While the invention is trivial, it also took a certain clever stroke of imagination, and a good case can be made that it was worthy of a patent. However, in 2000, the Federal Circuit Court of Appeals allowed a patent on the use of Bag Balm -- a medicated petroleum jelly used normally to treat raw cow udders -- to treat human baldness. The logic was that curing baldness had never been any part of the intent for which Bag Balm had been invented or sold, and so the alternate use was "unobvious".

The sticky point on "unobviousness" is that the court requires citation of some prior "suggestion, teaching, or motivation" to combine the components in order to hold the combination obvious. This leads to the current case, which was discussed in the article run here in January: Teleflex of Limerick, Pennsylvania, is suing KSR International of Ridgetown, Ontario over a patent on an adjustable gas pedal for electronic throttle controls. KSR holds that the combination of features, all of which are individually well-established, is "obvious" and non-patentable. More significantly, KSR is challenging the standard for "obvious", saying that "prior citation" is asking too much, instead proposing that the courts should determine if anyone with a working knowledge of the technology would find a given combination of existing components "obvious".

There's much resting on the outcome. If the courts stay with the existing standard for "unobviousness", then inventors will continue to struggle with junk patents and the infringement suits they spawn. However, if the courts go with KSR, they will unleash a flood of litigation over existing patents. Either way, the courts have a headache on their hands.

* ED: Later reports indicate that on 30 April 2007, the court unanimously judged in favor of KSR -- it appears the judiciary thinks the "unobvious" concept has been taken too far. Predictably, companies quickly came forward to request that suits against them for patent infringement be dismissed.



* LOW COST WHEELS: One of the odd things about the world, once one thinks of it, is why computers and other consumer electronics keep on getting cheaper, while cars get more expensive. According to BUSINESS WEEK ("The Race To Build Really Cheap Cars" by Gail Edmondson, 23 April 2007), some in the automotive industry are wondering the same thing.

The reason for this change in attitude is the emerging market for cars in developing nations. The automobile market in developed countries is stagnant and not all that profitable, while citizens of rapidly developing nations now have the money to upgrade from a bicycle or scooter. Tata Motors of India, part of the huge Tata industrial empire, is planning to introduce a car for only $2,500 USD in late 2008, and Renault-Nissan is planning to compete on that basis.

Renault-Nissan is already halfway there, having introduced the bare-bones Logan sedan in 2004 for a pricetag of only $7,200 USD, about 40% the price of the competition. Almost half a million have been sold in 51 countries, with plants in Romania and Russia churning out the cars round-the-clock and still not able to meet demand. Renault-Nissan thinks that the company can now move on to the next lower price bracket, though company CEO Carlos Ghosn says that it takes a different mindset to build such a vehicle, one that developed-world carmakers generally lack: "The people who have these skills are in India and China."

Tata's low-cost car is a demonstration of that assertion. When Tata announced work on a $2,500 USD car, Western automotive executives laughed, thinking it was going to be another Yugo -- or worse, a Trabi -- but once they saw what Tata was doing they stopped laughing so hard. The vehicle will win no prizes for anything but price, but it's not a toy, with four doors, a 25 kW (33 HP) engine, and a top speed of 128 KPH (80 MPH). Company officials say it will even pass a crash test.

Manufacturers in developing countries are used to building low-cost products. Says Tata Group Chairman Ratan N. Tata: "You have to cut costs on everything -- seats, materials, components, the whole package." Renault-Nissan understood this perfectly well with the development of the budget-rate Logan. The car was designed with a low parts count and to be very easy to assemble. Expensive electronics were deleted, a simple flat windshield was used instead of a curved windshield, and the side mirrors were designed symmetrically so that the same mirror assembly could be used on both sides of the vehicle. The car was developed in a sophisticated computer-aided design system that reduced the need for prototyping and made implementation of production tooling straightforward.

Renault-Nissan Logan

As this example shows, a low-cost car is not necessarily a low-tech car, with either simple technology or leading-edge technology used as needed to ensure the lowest cost. Renault is leveraging the efficient processes set up for designing the Logan into the development of higher-end vehicles, ensuring a higher profit margin.

Most major auto manufacturers have an eye on the low-cost market. Toyota, VW, Fiat, and Peugeot have announced plans for such vehicles, and Hyundai is setting up a factory in India to build cheap cars for global sales. GM is working with Daewoo of Korea and Chrysler is working with Chery of China on low-cost designs as well. The attraction is the huge, untapped market. The downside is a squeeze on profit margins that will be very painful for traditional car makers, who have long preferred to sell fancy cars with a fat markup.

The problem is that the car makers don't have much of an alternative, since low-cost cars also threaten domestic markets -- much to Renault-Nissan's surprise, the Logan is a big seller in Western Europe. To be sure, a car can't be sold in, say, the USA for the same low price as it sells for in India, since a US car has to meet higher standards: it needs emission controls, airbags, and antilock brakes to meet regulations.

However, even with all the added features a low-cost car will still be much cheaper than the competition, and if the traditional car makers don't fill the need, foreign competitors like Tata will. The traditional car makers will be hard-pressed to get a tenth of the profit they're used to out of a low-cost car, but pushy newcomers like Tata won't see the low margin as a problem. In fact, they will be delighted to get a fingerhold into a market that will allow them to sell more expensive models. The traditional players will have to adjust, or be shoved out of the game.



* BOTTOM-UP TORT REFORM: In the 1980s, the US developed something of a passion for torts -- civil lawsuits -- with private citizens and public groups taking on big business and often winning, with the result that it seemed that no action could be taken by any organization without risking lawsuits. Even if the lawsuits had no merit and had little chance of winning, they could still be a drain to fight. In the 1990s, pressures for tort reform began to rise. Given the clear abuses of the tort system, it seemed that reform had to happen, particularly as the American political environment became more conservative, but with so many vested interests involved, nothing seemed to change.

According to a BUSINESS WEEK article ("How Business Trounced The Trial Lawyers" by Michael Orey, 8 January 2007), the gridlock tying up tort reform may still be in effect in Washington DC, but at the state level the times are clearly changing. For example, it is all but impossible to press a lawsuit on a drug manufacturer in Michigan; six states have placed strong restrictions on asbestos suits; and 23 states do not permit customers of, say, McDonald's to sue because they ate too much fast food and got fat. It is no longer easy to file medical malpractice suits in many states, and the legal environment against class-action suits, once a common tactic for unscrupulous law firms to extort money out of corporations, is now so hostile that the industry journal AMERICAN LAWYER recently claimed that the era of mass class-action suits was at an end.

State courts have backed up state legislatures. In 2005, the Illinois Supreme Court struck down a $10.1 billion USD judgement against cigarette-maker Philip Morris for deceptive advertising of "light" cigarettes. State courts have even been willing to take action more directly, with the Alabama Supreme Court reversing 27 of 31 decisions originally in favor of plaintiffs in the 2004:2005 judicial session. In addition, although the US Congress has been of little help in the battle for tort reform so far, the Federal court system has been backing up state decisions that limit the rights of plaintiffs, and has also passed judgements on procedural issues that help limit abuse of the trial system -- for example, restricting the influence of professional "expert witnesses" with dubious qualifications who make a career out of going from court to court, supporting the cases of plaintiffs.

Up until recently, the plaintiff's bar had been going from success to success -- nailing Ford and Firestone for defective tires, working with state attorneys general to obtain judgements of hundreds of billions of dollars from cigarette companies, and helping bring down the leadership of Enron. Now a prominent law firm that was once very active in filing shareholder lawsuits against companies for supposed stock manipulation is under indictment; thousands of asbestos and silicosis claims are being investigated for fraud; and the tidal wave of Vioxx litigation -- the drug having developed an immediate link in the public mind with the word "lawsuit" -- is now going back out to sea. Of the dozen Vioxx cases that have reached a verdict, Merck has now won eight.

* Ironically, although the state of Texas is regarded with good reason as a stronghold of conservatism, it was long one of the most liberal states as far as the plaintiff's bar was concerned. Texas lawyers were pioneers in asbestos lawsuits, and also all but invented "forum shopping", or the game of finding courts that were the most likely to favor the plaintiffs in a particular case. Some Texas lawyers became wildly rich; Houston attorney John M. O'Quinn, who pressed cases over tobacco, breast implants, and diet drugs, acquired a classic car collection valued at $100 million USD.

The lawyers were careful to pump money into the election of friendly state legislators and judges. Business had been reluctant to get into politics, and when it did, the focus was at the Federal level, where ironically the inertia against reform turned out to be greatest. It was prominent Republican organizer Karl Rove who turned things around in Texas. The Left may not be able to stand the sight of Rove, but even they have to admit that he's a master political strategist, and the man who helped make George W. Bush the governor of Texas also spent time helping convert the Texas Supreme Court from 100% Democrat to 100% Republican. Rove understood that the greatest payoff would be at the state level, and industry groups like the US Chamber of Commerce and the American Tort Reform Association also saw the light. A state lobbying group, Texans for Lawsuit Reform (TLR), became a powerhouse in changing the legal landscape of Texas.

Beginning in 1995 and reaching a crest with a rush of new laws in 2003, Texas turned about from a plaintiff's paradise to an environment where Hugh Rice Kelly, a TLR official, could tell the BUSINESS WEEK reporter with a perfectly justified bit of smugness: "We have covered most of the things we have wanted to have corrected." For example:

The Texas Supreme Court is also no longer very sympathetic to plaintiffs in appeals. In 67 appeals considered during the 2005:2006 judicial term, the court judged against the plaintiff in 57 cases. The very strong bias against the plaintiff has made it more difficult for plaintiffs to get a settlement out of court, since the defendants simply shrug and say: "We'll beat you on appeal."

On top of the legislative offensive, the TLR and its allies also conducted a public-relations campaign that successfully convinced most Texans that the lawyers were simply greedy extortionists -- it's easy to suspect a video tour of O'Quinn's car collection was helpful in this exercise. Once upon a time, lawyers would be glad to get a jury trial, since they could sway the jurors with tales of woe concerning the plaintiffs, but now the public is so generally hostile to the lawyers that jury trials have become undesirable. In a recent negligence case, a plaintiff's lawyer reported that most of the prospective jurors had to be dismissed when they said they would not contemplate punitive damages, or damages of more than a million dollars.

Texas law firms are now scaling back, focusing more on commercial suits and small cases whose merits are obvious. The reduction of flow of money to the lawyers is all for the good as far as TLR and their allies go, since it means less money to influence the government. The tort reformers aren't satisfied with what they have now, either, since the level of reform varies from state to state. No doubt the Federal government will have to become involved sooner or later to try to set more uniform standards. It might seem to be an inefficient way to create Federal standards by lobbying the governments of the 50 individual states and hoping things will bubble up eventually, but nobody ever claimed laws were created in an efficient fashion.