* Entries include: JFK assassination (series), India development (series), shoplifting as global issue, radio astronomers against radio noise, finding storage for genomic data, diamond for quantum computing, diminishing returns for supercomputing, green funerals, Juno Jupiter orbiter, intestinal flora versus diet, nuclear cleanup at Hanford, and dubious economics of nuclear power.
* NEWS COMMENTARY FOR MAY 2011: All the public excitement over the killing of Osama bin Laden in the last month tended to obscure the fact that the democratic revolutions that have swept the Islamic world had reduced him to near-irrelevance. The ultimate fate of the revolutions in progress is uncertain, and a degree of "headline weariness" has set in, shoving the news off the front pages. However, things are still happening.
For a time, it seemed as though the revolt against Libyan dictator Muammar Qaddafi was likely to be crushed -- but, as reported by THE ECONOMIST, events are now demonstrating that, as is often the case with insurgencies, time is on the side of the insurgents. Even if they can't win outright, if they can just hold out and keep up the pressure, the old regime's position will just keep getting weaker and weaker. If the rebels don't lose, they win; if the regime doesn't win, it loses.
The rebels are making some gains, supported by NATO airstrikes that crush Qaddafi's heavy weaponry. The rebels have become less amateurish, now featuring a clear chain of command controlled by military professionals and with better communications with NATO. The rebels have also established a credible interim government with qualified leadership, and have good relations with foreign backers.
In the meantime, Qaddafi is running out of fuel, food, and ready cash. There have been more high-profile defections from his government, while the International Criminal Court has issued warrants for several of his senior officials for war crimes. Some outsiders are talking about a ceasefire and a gracious exit for Qaddafi, but rebel leadership is not so enthusiastic about the idea -- for good reasons, since if anything has characterized Muammar Qaddafi, it is his staying power. He isn't the sort of man to go quietly off into the dark night; it's a better bet that he will fight to the bitter end.
* As reported by an article from BUSINESS WEEK ("Tax Lab", 11 April 2011), thanks to monster budget deficits, the US government is now examining changes to America's hideously complicated tax law. There is a broad consensus for tax reform; the problem is that there's considerable disagreement on how to go about it, with competing ideas fighting for dominance, such as "Fair Tax", "Tax Amnesty", "Value-Added Tax", and "Flat Tax". Each idea seems to have advantages, along with some drawbacks.
Western European nations haven't been happy about seeing nations like Slovakia siphon off investment funds and have been pushing for tax harmonization across the European Union. Such feuding is of no direct concern for America; it's a matter the Europeans will necessarily work out among themselves. The question for Americans is whether the example of Eastern Europe suggests the Flat Tax is a good idea for the USA as well. Whatever options America settles on for tax reform, there's no question that something has to be done -- and whatever is done, it's not going to fix the problem right away.
* The big news for the month was the end of the world on 21 May. The flap over the prediction of doom was of course in large part marginal media sensationalism, since much the same story gets played every now and then. As has been pointed out, if an infinite number of monkeys had an infinite number of sacred texts, they would predict that the world would end an infinite number of times for every day in the history of the Earth.
* CRIME SPREE: A note in THE ECONOMIST says that India has the dubious distinction of topping the charts in pilferage. A study performed by the UK Centre for Retail Research covering 42 countries shows that Indian retailers lose goods worth 2.72% of sales to theft. Taiwan had the smallest losses, only 0.87%. Globally, retail theft costs about 1.36% of sales, with losses equivalent to hundreds of billions of dollars.
The obvious reaction is to judge Indians as unusually prone to thievery, but the authors of the report say that's not the issue -- it's weak shop security. 15 years ago, Taiwan suffered heavily from shoplifting, but then Taiwanese retailers decided to acquire the technology and procedures to clamp down on thieves. Retail security is likely to become a boom business in India as well.
In the USA, crooked employees are a worse problem than thieving customers; in Europe, it's the other way around. Employees, since they're more or less trusted and know the security systems, do tend to find it easier to steal than customers, and employees tend to steal more valuable merchandise. However, shoplifters can still be a real threat, in some cases working as members of well-organized gangs that target specific products and have efficient channels for profitably disposing of them.
One reason that shoplifting is attractive is that the authorities don't pay much attention to it. Even in the UK, where law enforcement is strict, no more than about 10,000 of the 450,000 shoplifters busted each year go to jail, and then for an average of about two months. The end result is that retailers have to deal with the problem on their own. They put security tags on their products, install tag readers and surveillance cameras, train their employees to spot thieves, and run potential hires through screening software to check for criminal records.
It's an expense, but since security measures only cost about 0.34% of sales, they are still a real bargain compared to the kind of losses suffered by Indian retailers -- which is why there is little doubt that security in India is likely to improve: it's much cheaper than the alternative. Security tech keeps getting cheaper and better, too, but then again thieves are always coming up with clever new tricks. The security industry has a guaranteed business, and it's not likely to go bust any time soon.
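To put those percentages in perspective, a quick back-of-the-envelope calculation -- shrinkage and security figures from the survey, the retailer's size being an assumed example:

```python
# Rough comparison of retail security cost versus theft losses, using the
# shrinkage rates cited from the Centre for Retail Research study.
def annual_theft_loss(sales, shrinkage_rate):
    """Value of goods lost to theft, given annual sales and shrinkage rate."""
    return sales * shrinkage_rate

sales = 1_000_000.0                              # hypothetical retailer: $1M/year
india_loss = annual_theft_loss(sales, 0.0272)    # 2.72% shrinkage (India)
taiwan_loss = annual_theft_loss(sales, 0.0087)   # 0.87% shrinkage (Taiwan)
security_cost = sales * 0.0034                   # security spend: 0.34% of sales

# Security pays for itself if it closes most of the gap to best practice.
savings_if_improved = india_loss - taiwan_loss
print(f"Losses at Indian rates:    ${india_loss:,.0f}")
print(f"Losses at Taiwanese rates: ${taiwan_loss:,.0f}")
print(f"Security cost: ${security_cost:,.0f} -- potential savings: ${savings_if_improved:,.0f}")
```

Even if tightened security only got an Indian retailer partway to Taiwanese loss rates, the spend would pay for itself several times over.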
* THE KILLING OF JFK -- THE BALANCE OF EVIDENCE (18): Having discussed the wounds and bullets, the next topic is to consider whether Oswald could have scored hits on JFK and Connally the way the Warren Commission claimed he did. The first part of this issue is the Carcano rifle itself. Conspiracy theorists often insist that the Carcano rifle was a notoriously bad weapon, so inaccurate that it was known as the "humanitarian weapon" for its inability to hurt anyone.
In practice, it is maybe not impossible but certainly difficult to read through firearms encyclopedias in a library and find negative comments about the Carcano. It is easy to learn about infamously bad weapons, like the French Chauchat machine gun, said to make a fair club at best; or about the idiosyncrasies of some well-known weapons like the cheap-&-dirty British Sten submachine gun, which had a tendency to burst into automatic fire spontaneously if carelessly dropped, and the problems that notoriously afflicted the M-16 assault rifle in Vietnam. Entries about the Carcano are bland. Some sources state that Finland obtained a batch of Carcanos and Finnish troops using them against the Soviets were unhappy with them, but primarily because it was hard to obtain ammunition for the weapons.
A detailed history of the Carcano series by a firearms enthusiast named John Sheehan was published in GUNS magazine in August 2007, with the title: "Italy's Mannlicher-Carcano: How Did Such A Good Rifle Get Such A Bad Reputation?" The article detailed the history, variants, and production of the weapon, making no mention of the JFK assassination; it provided a generally positive evaluation of the weapon and its service career. Several Carcano owners corresponding with the magazine also said that the Carcano was unfairly maligned. The most negative comment criticized the workmanship of the weapon, saying it was inferior to other widely-used rifles of the World Wars, but still characterized it as a "great lightweight carry gun."
A video available on YouTube demonstrated a shooter scoring hits on a 60 centimeter (2 foot) target from a range of 600 meters (650 yards) -- not every shot was a hit, but he still scored hits. The same video showed the shooter firing six shots, with a Carcano fitted with a scope, in less than six seconds. When asked if the shots were hits, the author replied: "All hits on a 10-by-36-inch metal plate at 120 yards." A commenter sniped at the author, saying he had "accurized" the weapon, and couldn't do so well against a moving target. The author replied:
What are you claiming that I've done to this rifle? I assure you that I have no problem whatsoever striking a moving target -- would you like to make some sort of wager on this? And as far as accuracy, it is usually the person holding the weapon that determines how accurate it can be fired.
The stories about what a wretched weapon the Carcano is are common among conspiracy theorists, but the exception among shooters. Oswald's Carcano was evaluated by the US Army Infantry Weapons Evaluation Branch at the request of the Warren Commission, with the boss of the evaluation organization, Ronald Simmons, testifying that his people fired it 47 times and it proved "quite accurate" -- the accuracy being comparable to that of the M-14 rifle, the standard US Army weapon of the time.
Conspiracy theorists objected that the Army shooters were all "experts", but that's irrelevant, since the issue is, at least for the moment, whether the Carcano rifle was an ineffective weapon, and the tests clearly showed that it was accurate; besides, the expertise of the Army shooters meant they were well-qualified to judge the accuracy of the weapon. Conspiracy theorists have also claimed that Oswald's Carcano was in terrible condition, but though it was clearly as weatherbeaten and banged-up as one might expect of a weapon over 20 years old, again the Army shooters had no great trouble with it -- reporting only minor difficulties, such as initial fumbling with the bolt action and some unfamiliarity with the trigger pull. [TO BE CONTINUED]
* GIMMICKS & GADGETS: As reported by THE ECONOMIST, a company in Orlando, Florida named Planar Energy -- established in 2007 as a spinoff from the US National Renewable Energy Laboratory (NREL) -- is now beginning pilot production of a lithium-ion battery that can be printed onto substrates such as metal or plastic.
A battery consists of metallic anode and cathode electrode plates, separated by a conductive electrolyte material. Typically, electrolytes are liquids or gels, but Planar has developed a solid electrolyte material. In the Planar battery, the anode consists of doped tin oxide with lithium alloys; the cathode consists of lithium manganese dioxide; and the solid electrolyte consists of materials called "thio-lithium superionic conductors" AKA "thio-LISICONS". Each layer can be simply sprayed onto a continuous moving sheet of substrate material, with the option of including other thin-film devices as part of the process. Planar claims that in maturity their batteries will cost a third as much as conventional lithium-ion batteries, have half the weight for the same storage capacity, and be good for tens of thousands of recharging cycles. Wait and see.
* In related news, THE ECONOMIST reported on the work of Paul Braun and his colleagues at the University of Illinois in Urbana-Champaign to develop a battery that charges much faster than current batteries. One way to increase the charge-discharge rate is to increase the surface area of the electrode plates -- but how can that be done without building a much bigger battery?
The researchers started with a matrix of polystyrene spheres, each about a millionth of a meter in diameter, then used electrodeposition to fill the spaces between the spheres with nickel. That done, the spheres were melted away and the hollows enlarged by chemical machining, resulting in a spongy structure with a tremendous amount of surface area. To construct a battery electrode, the structure is plated with an appropriate electrode material -- nickel oxyhydroxide for nickel-metal hydride batteries, manganese dioxide for lithium-ion batteries.
A battery fabricated with such technology can charge one to two orders of magnitude faster than current technology, meaning a full recharge in minutes. The new scheme seems suited to mass production and should have a modest cost premium of no more than 30%. However, it comes with a catch: fast recharging means hefty currents, and that means charging gear able to supply those currents, which isn't trivial to build.
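The arithmetic behind that catch is simple: delivering the same energy in a fraction of the time demands proportionally more power. A sketch, using an assumed battery pack size and voltage for illustration:

```python
# Why fast recharging demands hefty currents: charging the same energy in
# less time means proportionally more power. All figures here are assumed
# examples, not from the article.
def charge_power_kw(capacity_kwh, minutes):
    """Average power (kW) needed to deliver capacity_kwh in the given time."""
    return capacity_kwh / (minutes / 60.0)

capacity = 24.0                             # kWh -- roughly an electric-car pack
slow = charge_power_kw(capacity, 8 * 60)    # leisurely 8-hour charge
fast = charge_power_kw(capacity, 6)         # 6-minute charge, 80x faster

# At an assumed 400-volt pack, that power implies a very heavy current.
amps = fast * 1000.0 / 400.0
print(f"8-hour charge: {slow:.0f} kW; 6-minute charge: {fast:.0f} kW ({amps:.0f} A at 400 V)")
```

Three kilowatts is a household circuit; hundreds of kilowatts is substation territory, which is exactly why the charging gear is the hard part.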
* WIRED Online had a note on a little portable LED lamp, the "Mantis", from a design shop named "Quirky". It's shaped a bit like a space torpedo, with 11 bluish LEDs running down its length and power from twin AA batteries. It's slick; it can be stood up on two pop-out legs to cast light over a book or work surface, or clipped onto a clamp-anywhere mount.
The interesting thing about the Mantis is that it's a "crowdsourced" product. The group running the organization accepts suggestions from anyone, sorts them out, and comes up with a design. The design is then announced to the world, and if enough people lay down their money, it's put into production. The Mantis costs $30 USD, not bad for a specialty item, and required 1,500 orders to go into production. A crazy business model? Maybe so, but in an era of internetworking and an emerging era of rapid flexible production, it may be craziness with a future.
* QUIET PLEASE: As reported by an article from AAAS SCIENCE ("Radio Astronomers Take Arms Against A Sea Of Signals" by Yudhijit Bhattacharjee, 22 October 2010), radio gear isn't very welcome in the region surrounding the Green Bank Telescope (GBT), a 100 meter (328 foot) wide fully steerable radio telescope operated by the US National Radio Astronomy Observatory (NRAO) in Green Bank, West Virginia. The GBT is shielded from radio emissions from the surrounding world by the Appalachian Mountains; to prevent pollution by radio emitters within that barrier, in 1958 the US Congress declared a 34,000 square kilometer (13,100 square mile) region around the NRAO site a "radio quiet zone", banning all transmitters that could interfere with observations. Cellphones don't work there.
Unfortunately, the massive proliferation of wireless devices is making the radio silence harder to maintain. The recent introduction of US digital TV broadcasts, which are noisier than the old analog TV broadcasts, didn't help any. Traditional nuisances such as aircraft and spacecraft transmissions contribute to the chaos. Intrusive emissions can wipe out several hours of observations at a time; five years ago, an entire night of GBT observations was wiped out because an NRAO engineer had installed a wi-fi modem in his house. The problem is getting worse as radio telescopes become more sensitive.
Radio astronomers around the world are struggling to deal with the problem. Tools include electronics to help radio telescopes tolerate power surges that can disrupt or even destroy receiver gear; filters to block out noisy radio bands; techniques for rejecting signals from "sidelobes" off the telescope's boresight; schemes to subtract noise from data already collected; and community action efforts to get neighbors to play along.
* Problems begin with the amplifiers used to boost faint signals received from space. Anyone playing with a stereo system knows that if a stereo is turned up to play soft music, then when loud music comes on it can overload the speaker system, producing a roar of noise. More or less the same thing happens with amplified radio signals, with sensitive amplifiers straining to boost faint signals suddenly overloaded by a strong signal, effectively destroying the observation.
To guard against such "nonlinear" response, amplifiers are being designed with much wider amplitude range or "headroom". That way, even if a strong signal comes in on top of a faint signal, subtracting the strong signal from the observation data will reveal the faint signal. New radio telescopes such as the "Low Frequency Array (LOFAR)" -- a network of 25,000 antennas spread across densely-urbanized Europe, with headquarters in the Netherlands -- feature such wide-range amplifiers, and older radio telescopes are being refitted with them. LOFAR also uses a trick in which an array of antennas is used to locate an interference source like an aircraft, with electronic steering then used to blank out signals coming into the radio telescope array from that direction.
Astronomers have been weeding out bogus signals from observational data for a long time, but in the past it's been a laborious manual procedure; now it's increasingly automated. Much of the interference is transient, such as a transmission from an aircraft flying overhead; LOFAR uses the latest digital processing to help get rid of such transients. The key is that LOFAR can collect data with a very fast sampling rate, with intervals of time down into the nanoseconds -- and the fast sampling allows noise to be extracted with great precision, preserving more of the signal.
The latest processing software is also smart enough to spot interference. Signals from cosmic sources tend to follow Gaussian distributions, while those from terrestrial sources don't. Researchers at the New Jersey Institute of Technology in Newark and the Korea Astronomy & Space Science Institute have developed an algorithm to pick out Gaussian signals and discard the rest. Of course, it is also possible to filter out troublesome signals before they're even recorded, with some radio telescopes featuring filters for the 2 to 2.2 gigahertz (GHz) band, used by many wireless devices. Unfortunately, that degrades the quality of the observations -- but it's better than losing them completely.
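The article doesn't detail the algorithm, but one standard way to test for Gaussianity is to compute the excess kurtosis of a block of samples: it is near zero for Gaussian noise but deviates for man-made signals such as a steady carrier. A toy sketch of that idea (not the researchers' actual method):

```python
# Toy Gaussianity-based interference flagging: cosmic noise is near-Gaussian,
# while a man-made carrier buried in noise is not. Illustrative sketch only.
import math
import random

def excess_kurtosis(samples):
    """Fourth standardized moment minus 3: zero for an ideal Gaussian."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / n
    m4 = sum((x - mean) ** 4 for x in samples) / n
    return m4 / (var ** 2) - 3.0

random.seed(1)
gaussian_noise = [random.gauss(0, 1) for _ in range(20000)]
# A sine-wave carrier plus noise -- decidedly non-Gaussian statistics:
carrier = [3 * math.sin(0.1 * i) + random.gauss(0, 1) for i in range(20000)]

threshold = 0.2
for name, block in [("cosmic-like noise", gaussian_noise), ("carrier", carrier)]:
    k = excess_kurtosis(block)
    verdict = "REJECT" if abs(k) > threshold else "keep"
    print(f"{name}: excess kurtosis {k:+.2f} -> {verdict}")
```

Real systems run such statistics continuously over short time-frequency chunks, so a transient burst of interference only costs the chunks it actually touches.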
Radio astronomy observatories typically have spectrum managers whose job it is to deal with noise sources. For example, in 2005 the US National Aeronautics & Space Administration (NASA) was preparing to launch the "Cloudsat" environmental satellite; spectrum managers of NRAO's Very Long Baseline Array (VLBA) -- a set of ten radio dishes scattered across the USA, from the Virgin Islands to Hawaii -- realized that Cloudsat's operational frequency of 94 GHz was within the 80 to 96 GHz band used by the VLBA. If any of the antennas had been stowed straight up, at 90 degrees, their receiver systems could have been fried by Cloudsat. They were stowed instead at 87 or 88 degrees, eliminating the threat posed by the spacecraft, though its interference still had to be weeded out.
Spectrum managers also handle community outreach. At the Giant Metrewave Radio Telescope (GMRT) in Pune, India, for example, managers were able to persuade cellphone network operators not to place towers within 20 kilometers (12.4 miles) of the GMRT. Local officials also agreed to help keep high-voltage lines free of stray wires, which leak power and produce interference. At the GBT, managers have been able to persuade some local wi-fi users to switch to wired connections by giving them free high-quality modems.
Spectrum managers say the public is usually happy to help. Two years ago, the GBT was to be used to inspect a distant galaxy for a distinctive spectral line at 855 megahertz (MHz). The problem was that a local electric utility used transmitters on that frequency to control their network systems. Spectrum managers simply asked the utility to shut down the transmitters on weekend nights; when that didn't work out, the utility agreed to shut them down for longer intervals.
New challenges keep popping up. The latest generation of "smart" automobiles features radars operating at 76 GHz to maintain a safe distance from other vehicles. One such radar could fry a radio telescope's sensitive receivers, and radio astronomers have been lobbying communications authorities with their concerns over the matter. The rising flood of wireless annoyance is why new radio telescopes, such as the Square Kilometer Array, are being sited in isolated regions such as Western Australia or the Northern Cape province of South Africa, whose governments have promised to protect them from interference. Elsewhere, however, spectrum managers have their work cut out for them in ensuring that their radio telescopes aren't swamped in a sea of radio noise.
* THE GENOME DATA CRUNCH: Since the turn of the century, the decoding of genomes of various organisms has become routine, with costs of sequencing dropping rapidly and the rate of sequencings accelerating. As discussed by an article from AAAS SCIENCE ("Will Computers Crash Genomics?" by Elizabeth Pennisi, 11 February 2011), this genomic golden age now risks being bogged down in a hidden problem: how to store and handle the massive quantities of genomic data being produced in ever-increasing volumes.
Agencies funding genome research have been slow to recognize the problem. Up to the middle of the last decade, it wasn't actually an issue, since genome research organizations typically had the resources on hand to do the job -- but the data was beginning to pile up. In 2007, the US National Center for Biotechnology Information (NCBI) in Bethesda, Maryland, was able to make its "GenBank" database, loaded up with 150 billion bases of genetic information, available to researchers who needed it.
Then the flood began, as new, relatively low-cost, faster sequencers began to hit the market. The challenge presented by the ballooning rate of new genomic information was compounded by the fact that the new fast sequencers generally produced sets of very short sequences, from about 50 to 120 bases, that had to be spliced together by software, increasing the load on the data systems supporting the genomics effort. Add to that the challenge of analyzing and comparing sets of hundreds, even thousands of different genomes, and it's no wonder genomic researchers started to panic. It wasn't just a question of lacking the hardware and software; there weren't enough people available who had the proper background in bioinformatics to actually do the work.
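To see why splicing short reads is so compute-hungry, consider the core operation: merging two reads wherever the tail of one matches the head of another. A minimal sketch of that overlap-merge step, with made-up sequences (real assemblers do far more sophisticated versions of this across millions of reads at once):

```python
# Toy illustration of short-read splicing: merge read b onto read a by the
# longest overlap of a's suffix with b's prefix. Real assemblers apply this
# kind of matching across millions of reads; this handles only one pair.
def merge_reads(a, b, min_overlap=3):
    """Return a merged with b by their longest suffix/prefix overlap,
    or None if no overlap of at least min_overlap bases exists."""
    for k in range(min(len(a), len(b)), min_overlap - 1, -1):
        if a[-k:] == b[:k]:
            return a + b[k:]
    return None

# Two overlapping fragments of an invented sequence:
read1 = "ATGGCGTACGT"
read2 = "TACGTTAGGCA"
print(merge_reads(read1, read2))   # overlap "TACGT" -> "ATGGCGTACGTTAGGCA"
```

Multiply this pairwise comparison across billions of 50-to-120-base reads and the load on the supporting data systems becomes obvious.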
By 2005, some could see the writing on the wall. Two bioinformaticists, James Taylor of Emory University in Atlanta, Georgia, and Anton Nekrutenko of Pennsylvania State University, decided to build a software "framework" named "Galaxy" that integrated genomics databases and popular genomics software tools in a single, easy-to-use package. Galaxy can be downloaded to a PC or run off Penn State's servers via the internet, allowing an investigator to perform basic genomic analysis without any special local resources.
Although Galaxy helped address the software integration problem, it did nothing in itself to deal with the rising volume of genomic information, and Penn State's servers were clearly unable to keep up with the load. There was no way around that problem except more hardware, but Taylor and his colleagues realized there was a potential solution at hand: "cloud computing" services that offer access to huge internet-connected data centers. As discussed here recently, Amazon, Microsoft, and Google all provide cloud-computing services for a price; there are also non-profit organizations involved in cloud computing, such as the "Open Cloud Consortium". Taylor and his colleagues ported Galaxy to run on a "virtual computer" based in the cloud and performed a trial genomic analysis on it in 2010; the test would have bogged down Penn State's servers, but it was completed in an hour by the cloud, at a cost of only $20 USD.
Other genomic research groups have run similar trials and have found the cloud a remarkably cheap way to get the job done -- not merely because of the low costs of cloud services but because there's no need for the researchers to maintain a large system of their own. In addition, establishing a master genomic data system on the cloud would allow convenient access for anyone in the world, without duplications of databases on separate server systems that can lead to confusion.
However, some warn that cloud computing services are immature -- unforeseen problems arise, and there are worries about security -- and it doesn't seem to be time yet to rely on the cloud to provide a "one-stop" genomic data system. In addition, despite the massive computing horsepower of the cloud, it can't really do the job for an analysis that requires massive fast interchange of data between the sub-elements of the job. For that, there's no substitute for dedicated supercomputers.
To be sure, cloud computing does provide massive amounts of storage at reasonable cost, but despite the fact that the price per bit of mass storage has steadily dropped over the past decades, the amount of genomics data is increasing much more rapidly, and even the cloud won't be able to keep up at the rate things are going. It is becoming increasingly accepted that, eventually, researchers will only be able to retain the results of a study and will have to discard the raw data. However, for the time being there's some reluctance to take that step.
One of the biggest problems right now is funding for more bioinformatic resources, particularly in training and hiring a new generation of bioinformaticists. Funding agencies are aware of the issue and discussing it, and overall the genomics community is becoming upbeat about the odds of dealing with the data crisis. Says one researcher: "Do I think these problems will be solved? I'm optimistic."
* CREATIVE CHAOS (3): Efforts are being made to address India's problems. Current Indian government plans envision spending $500 billion USD in the 2007:2012 timeframe on infrastructure. Spending is likely to pick up, since the government is getting more skillful at promoting private investment.
As far as education goes, even illiterate parents now want their kids to get schooling, urging them to study in hopes that they will find a job in a call center, and then move up from there. In response to lousy public schools, cheap private schools have been springing up -- in rural areas, more than 20% of Indian students, most of them poor, attend private schools, with the schools helping to highlight just how bad the public school system is. The literacy rate is growing fast; among 15:24-year-olds it's over 80%, though boys do better than girls.
Reforms are slowly working their way through the political machinery. A proposed national sales tax would rationalize the disjointed patchwork of state and local taxes. A proposed land law would simplify and speed up sales of land for infrastructure and businesses. India's regulatory tangle hasn't disappeared, but it is slowly shrinking.
Some investors find India's untidy democracy difficult to deal with. Not only is there the confusing overlap and clash of state and national laws, but there's the mad factionalism of the politics. Even when laws get passed, they may not stay passed; it can be hard to figure out who's in charge. Says a Western banker: "Don't tell the Indians I said this, but I'm much more comfortable with China than India."
That's a biting remark, since China's governance is clearly far from perfect, and few would think China's people are well-served by its government's repressiveness. India clearly has a ways to go, but over the long run it may pass by China. Authoritarianism doesn't cope with change well; unlike China, Indians can obtain a new government without a revolution. India may be chaotic, but there's a certain energy and flexibility in that chaos that can reward those who are able to ride the storm.
* In related news, the effort by the government of India to assign all the nation's citizens a "Universal ID (UID)" number -- like a US Social Security number, but more robust -- was discussed here a few years back. An article in THE ECONOMIST ("Identifying a Billion Indians", 29 January 2011) discussed the progress of the effort, and outlined the opportunities it presents.
Each Indian who applies for a UID must fill out a form -- since illiteracy is common in rural India, sometimes applicants need help on that score -- and then submit to a fingerprint scan and an ID photo. The wheels of bureaucracy then turn, and a few weeks later the person has a UID. The procedure is straightforward and millions have been processed so far, but that quantity has to be considered in light of the fact that there are 1.2 billion Indians.
Think of the nightmare the staggering scale of the job presents. How does the system handle the biometric data of over a billion people? How does the system ensure the data is accurate? When an individual is enrolled, the result is a data set that then has to be compared to all the others to determine if there's a duplication; when enrollments reach a million a day, that will mean the system will be performing matches continuously at a rate of 14 billion per second.
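That 14-billion-per-second figure checks out with a little arithmetic: each new enrollment has to be matched against everyone already on file, and at a million enrollments a day against the full population, the rate is exactly that staggering.

```python
# Checking the deduplication arithmetic: every new enrollment must be
# compared against every existing record to catch duplicates.
enrollments_per_day = 1_000_000
database_size = 1_200_000_000       # ~1.2 billion Indians (full rollout)
seconds_per_day = 24 * 60 * 60

comparisons_per_day = enrollments_per_day * database_size
comparisons_per_second = comparisons_per_day / seconds_per_day
# Comes out near 1.4e10 -- roughly 14 billion comparisons per second.
print(f"{comparisons_per_second:.2e} comparisons per second")
```

Numbers like that are why the matching has to be distributed across racks of servers rather than handled by any single machine.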
The task was awarded to private contractors using an innovative scheme. There are three contractors: Accenture and L-1 Identity Solutions of the USA, plus Morpho of France. The firm that does the best and fastest gets 50% of the work; the others get 30% or 20%. The allocation is frequently reassessed, so if the second-best firm starts doing better, it picks up work from the leading firm. That keeps everyone on their toes.
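The allocation mechanism described above is simple enough to sketch: rank the vendors by some performance score at each reassessment, then hand out the 50/30/20 split in rank order. Vendor names and scores here are illustrative, not from the article.

```python
# Sketch of the performance-based work split described above: at each
# reassessment, the best-scoring contractor gets the biggest share.
# Scores and names are made-up examples.
def allocate_work(scores, shares=(0.5, 0.3, 0.2)):
    """Return {vendor: share of work}, best performer getting the top share."""
    ranked = sorted(scores, key=scores.get, reverse=True)
    return dict(zip(ranked, shares))

scores = {"Vendor A": 0.91, "Vendor B": 0.88, "Vendor C": 0.95}
print(allocate_work(scores))   # Vendor C leads -> 50% of the work
```

Because the ranking is recomputed frequently, a vendor that slips loses work to the one coming up behind it, which is what keeps everyone on their toes.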
The goals of the effort are to improve government services and reduce corruption. It is estimated that a staggering two-thirds of the subsidized grain the Indian government allocates for the poor is either stolen or adulterated; when middlemen say they have delivered so many bags of rice to so many thousands of peasants, there is no way to tell if they are lying. However, if the peasants are given iris scans when they pick up their rations, it will be harder to scam the system. Similar controls could be used to deal with voter fraud.
An effective UID scheme would also smooth financial transactions. Some 42 million poor households work under a government program that guarantees them up to 100 days of work at the minimum wage each year. The money is welcome, but the trip to the bank to get it is not. The banks are in urban areas, and so poor peasants lacking their own transportation have to make the trek to the bank and back home again, eating up scarce money and time they don't have to spare. Under the new order, scanners will be supplied to village shops and hooked up to distant banks via mobile phone. A peasant farmer could walk into a shop, scan his fingerprints, and authorize a transfer to the shopkeeper -- who would then pay off the farmer, less a small fee. The shopkeepers are particularly excited; not only will they profit from the vast numbers of transactions, but they will also have a customer base with more money to spend.
Since the UID system is an open platform, businesses will be able to leverage innovative applications onto it. Hospitals could match medical records with patients who are far from home; insurers would be able to perform background checks on prospective clients. Microfinance schemes -- which have been hobbled by the inclination of cheats to take out loans from several lenders in parallel and then walk off -- will benefit from the increased security. Schools will benefit as well by being able to track student records, even when students change schools.
There is of course a security angle to the UID; India suffers from a serious terrorist threat. However, part of the effort has been to make sure that the privacy of law-abiding citizens is adequately protected. Even at that, companies believe that the UID scheme will give them much more information about the customer base -- that is, presuming that India can get things to work. If so, the UID scheme would not only help the wheels of society run more smoothly, but demonstrate that government programs can be effective and credible. [END OF SERIES]
* THE KILLING OF JFK -- THE BALANCE OF EVIDENCE (17): As a footnote to the discussion of the analysis of the composition of the bullets, what about the composition of the bullet recovered from the attack on General Walker? The bullet -- Warren Commission exhibit CE 573 -- was too mangled to permit identification as a Carcano round, much less determine that it had been fired from Oswald's rifle, but it was consistent with a Carcano round, a copper-jacketed (not steel-jacketed) bullet. The only evidence linking Oswald to the attack on Walker was the testimony of Marina Oswald; the "if I am captured" note in Russian he left behind, which was validated as being in his handwriting; and some photos that Oswald took of Walker's house that he failed to destroy when he burned his file on the general.
Conspiracy theorists play up various "suspicious features" in the Walker case: a 14-year-old boy in a neighboring house claimed that immediately after the shooting he saw two men, in different cars, leave a church parking lot next to Walker's house, while a friend of Walker's testified that two nights before the attack, he had seen "two men around the house peeking in windows." Both "leads" were investigated and neither went anywhere.
Conspiracy theorists have also wondered how Oswald could have missed the shot at Walker when he scored hits from the TSBD. That does sound puzzling on the face of it, but Walker himself pointed out that he only survived due to a freak of luck: the bullet hit the slender piece of wood framing in the middle of the window and was deflected, the framing being effectively invisible to a shooter whose scope was focused on the more distant target. In addition, conspiracy theorists have claimed CE 573 was reported by the Dallas police to be a "steel jacketed" bullet; but even if it was, all those who formally examined CE 573 contradict that assertion, having concluded that it was consistent with, if not specifically identifiable as, 6.5 millimeter Carcano ammunition.
* As another footnote, the discussion of the fragments found in the limousine leads to the issue of the limousine as forensic evidence in itself. Conspiracy theorists like to claim that the limousine was promptly refurbished to eliminate all inconvenient evidence, but that wasn't the case. It was taken to the White House garage on the evening of 22 November and then given a fine-tooth combing by FBI and Secret Service agents to extract all possible evidence. It was refurbished, but only months later; obviously, if somebody had thought a "coverup" was needed, nobody saw any need to hurry.
Along with the fragments of bullets recovered from it, the limousine also featured a crack in the windshield, not visible in photos of the vehicle before the shooting and so likely to have been caused by it. Some conspiracy theorists claim that a bullet penetrated the windshield from the front, but there wasn't a hole -- just a crack. The windshield was made of auto safety glass, with two layers and a plastic film between them. As FBI Agent Robert Frazier told the Warren Commission, FBI agents recovered traces of bullet lead from the interior layer; there were cracks in the outer layer, but no traces of lead. There was also a dent in the chrome trim after the assassination; Frazier believed it was due to a fragment hitting the trim from the inside, possibly the same fragment that ended up on the driver's seat.
Another conspiracy tale that grew up around the limousine was based on the fact that the presidential seat could be raised to make the president more visible to the crowds. The story goes that the seat was raised before arrival at the TSBD to make JFK a better target. Actually, photos exist from other occasions showing JFK on the seat in the raised position, and comparison with the Zapruder movie makes it clear that the seat was in its normal, lower position during the motorcade. [TO BE CONTINUED]
* Space launches for April included:
-- 04 APR 11 / SOYUZ TMA-21 (ISS) -- A Russian Soyuz booster was launched from Baikonur in Kazakhstan to put the "Soyuz TMA-21" AKA "ISS 26S" manned space capsule into orbit on an International Space Station (ISS) support mission. The crew included Soyuz commander Aleksandr Samokutyayev (first space flight), flight engineer Andrei Borisenko (first space flight), both of the RKA, and NASA astronaut Ronald Garan (second space flight). They docked with the ISS Poisk module on 6 April, to join the "ISS Expedition 27" crew of station commander Dmitriy Kondratyev, Italian astronaut Paolo Nespoli, and NASA astronaut Catherine Coleman, who arrived at the station in December 2010.
-- 11 APR 11 / BEIDOU IGSO 3 -- A Chinese Long March 3A booster was launched from Xichang to put the "Beidou IGSO 3" navigation satellite into orbit.
-- 14 APR 11 / NROL-34 (USA 229) -- An Atlas 5 booster was launched from Vandenberg Air Force Base to put a secret military payload into space for the US National Reconnaissance Office (NRO). The spacecraft was designated "NROL-34" AKA "USA 229". The booster was in the "411" configuration, with a 4 meter (13 foot) diameter fairing, one solid rocket booster, and an upper stage with a single Centaur engine.
-- 20 APR 11 / RESOURCESAT 2, SMALLSATS x 2 -- An ISRO Polar Satellite Launch Vehicle was launched from India's Sriharikota launch center to put the "ResourceSat 2" remote sensing satellite into space. ResourceSat 2 had a launch mass of 1,205 kilograms (2,658 pounds), a design life of five years, and carried three visible / infrared imagers with a best resolution of 5.8 meters (19 feet), along with an AIS ship-tracking secondary payload provided by Com Dev of Canada. The launch also included the "XSAT" demonstration satellite for Singapore, and the "Youthsat" spacecraft for India and Russia; each had a launch mass of about 90 kilograms (200 pounds).
-- 22 APR 11 / YAHSAT 1A, INTELSAT NEW DAWN -- An Ariane 5 ECA booster was launched from Kourou to put the "Yahsat 1A" and Intelsat "New Dawn" geostationary comsats into orbit. Both satellites provided communications services for the Middle East, Africa, and Europe. Yahsat 1A was built by EADS Astrium for Al Yah Satcom Company of Abu Dhabi; it was based on the Eurostar 3000 bus, had a launch mass of 5,965 kilograms (13,160 pounds), carried a payload of C-band / Ku-band / Ka-band transponders, and had a design life of 15 years. It was placed in the geostationary slot at 52.5 degrees East longitude.
Intelsat's New Dawn satellite was built by Orbital Sciences Corporation and was based on the Orbital Star 2 comsat platform. It had a launch mass of 3,000 kilograms (6,600 pounds), carried a payload of C-band / Ku-band transponders, and had a design lifetime of 15 years. It was placed in the geostationary slot at 32.8 degrees East longitude.
-- 27 APR 11 / PROGRESS 42P (ISS) -- A Soyuz booster was launched from Baikonur to put a Progress tanker-freighter spacecraft into orbit on an International Space Station (ISS) supply mission. The spacecraft was designated "Progress 42P" AKA "Progress M-10M". It docked with the station's Pirs module on 29 April.
* OTHER SPACE NEWS: As reported by SPACEFLIGHT NOW Online, China is now pressing forward on deployment of a modular space station by 2020. As currently envisioned, the station will be roughly along the lines of the Russian Mir space station, consisting of three modules and supported by a tanker-cargo spacecraft. The core module will be over 18 meters (60 feet) long, with habitation spaces with an internal diameter of up to 4.2 meters (13 feet 10 inches). After launch, the core module would be augmented by twin experiment modules, the resulting complex having a mass of about 59 tonnes (65 tons). Once established, the station would be able to support a permanent crew of three, supplied by tanker-cargo spacecraft, with crews ferried by Shenzhou crewed space capsules.
The Chinese will take an initial step towards the station with the launch of the "Tiangong 1" spacecraft in the second half of 2011. Tiangong 1 -- the name means "Heavenly Palace" in Chinese -- will be an 8.6 tonne (9.5 ton) space station demonstrator intended to support docking trials and other procedures required for space station operations. Tiangong 1 features a docking port on the forward end of the spacecraft, supported by navigation and communications kit mounted there as well. The spacecraft will be lofted into orbit by a Long March 2F booster from the Jiuquan space center in the Gobi Desert. Two months after the launch of Tiangong 1, the unmanned "Shenzhou 8" capsule will be launched to perform initial docking experiments. Two crewed Shenzhou capsules are being built for launch in 2012; each will spend days to weeks at Tiangong 1, which is designed to support two years of operations.
China's next five-year strategic plan includes manned space missions spanning at least 20 days and the design and construction of an automated tanker-cargo spacecraft. Although China's space station announcement outlined a much smaller outpost than the US-driven International Space Station, the country also has plans for a heavy-lift rocket more powerful than any vehicle in the expected US inventory over the next half-decade. Such a rocket appears intended for heavyweight construction missions or crewed flights to the Moon and beyond.
* DIAMOND FOR QUANTUM COMPUTING: As reported by an article from AAAS SCIENCE ("Diamond Feats Give Quantum Computing A Solid Boost" by Robert F. Service, 6 August 2010), one of the interesting facts of quantum physics is that if a particle like an electron can exist in two states -- one example being the state very loosely named "spin", which can have UP or DOWN values for an electron -- then until the electron is observed, it will effectively exist in both states -- UP and DOWN at the same time. It's not a question of it being in one state or the other and we don't know what it is; it really is in both states at once.
Everybody admits that sounds absurd, but in principle such "quantum superposition" can be put to practical use in what has become known as "quantum computing". If the state of an electron or other quantum system that can take on such superimposed dual states is used to represent a binary digit, a "0" or "1", then a quantum computer designed to process such "quantum bits" or "qubits" could perform a calculation on an entire range of values at once. Eight qubits could represent 256 different values, and all 256 values could be processed in one shot; take the number of qubits up to 100 and the number of values processed at once would be 1.27E30.
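The arithmetic behind those figures is just powers of two; a quick check, my own illustration rather than anything from the article:

```python
# n qubits can hold 2**n classical values in superposition, so a quantum
# computer could in principle operate on all of them at once.

def values_in_superposition(n_qubits):
    """Number of classical values n qubits can represent simultaneously."""
    return 2 ** n_qubits

print(values_in_superposition(8))             # 256, as cited in the text
print(f"{values_in_superposition(100):.2e}")  # 1.27e+30 for 100 qubits
```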
This isn't a purely theoretical issue either, since lab demonstrations have shown it can work, and a number of algorithms have been devised that can take advantage of the unique parallel power of a quantum computer. The problem is that the lab demonstrations haven't been remotely practical for building a useful quantum computer, amounting to very simple experiments requiring a lot of expensive gear. Now a team of researchers under David Awschalom at the University of California, Santa Barbara (UC-SB) reports that they have developed a scheme to fabricate large arrays of qubit-storage devices on a diamond wafer.
In 2006, researchers found that when they injected nitrogen atoms into a wafer of crystalline diamond, the nitrogen atoms would not only insert themselves into the diamond lattice, but also occasionally evict carbon atoms, leaving behind a "vacancy". If a nitrogen atom ends up paired with a vacancy, one of its electrons can form a stable qubit based on its spin state -- which can be interrogated by radiofrequency (RF) signals, microwaves, or laser light. One of the difficulties in developing qubit storage devices is that the superposition of states is usually very unstable and hard to maintain long enough to perform a calculation; however, the "nitrogen-vacancy (NV)" centers, effectively "insulated" by the diamond lattice, are highly stable.
Fabricating NV centers in diamond wafers has proved troublesome, but the UC-SB researchers used electron-beam lithography to etch 3,600 holes in a mask on top of a diamond wafer in a 60 x 60 array, and then used a beam of nitrogen atoms to create the NV centers. They also came up with schemes of using RF signals to interact with the qubits in the NV centers. The result is still nothing resembling a practical quantum computer, but it is much more promising than other experiments to date, and has given researchers confidence that they will actually be able to build a workable quantum computer.
* DIMINISHING RETURNS FOR SUPERCOMPUTING: Every year brings announcement of a new supercomputer with performance that trounces last year's champion. Given such dramatic advances, can we expect to see supercomputers in 20 years' time that make our current machines look like desktop PCs in comparison? According to an article from IEEE SPECTRUM ("Next-Generation Supercomputers" by Peter Kogge, February 2011), the answer is NO.
Modern supercomputers are based on clusters of tightly interconnected microprocessors. For decades, as per Moore's Law, microprocessors became ever faster as their transistors became smaller -- but about five years ago, microprocessors maxed out at a clock rate of about 3 gigahertz (GHz). The problem is not that the transistors can't run faster -- they can -- but that the chip would then have to dissipate excessive amounts of heat. This "power wall" presents an obstacle to continued scaling-up of current supercomputer technology.
In 2007 the US military's Defense Advanced Research Projects Agency (DARPA) set up a study group directed by the author to investigate what would be required to build a supercomputer capable of 10E18 floating-point operations per second -- an "exaflop" -- by 2015. The conclusion of the group was discouraging: it wasn't going to happen by 2015, and it might not even be possible in the foreseeable future. In addition, it wasn't going to be easy to make current "petaflop" computers -- capable of 10E15 flops -- much cheaper, more compact, and less power-hungry. As the author put it: the party isn't exactly over, but the cops have arrived, and the music has been turned way down.
* Progress in the past was surprisingly rapid. A leading-edge supercomputer of the 1980s could operate at a gigaflop -- 10E9, a billion flops -- with technology since then advancing to increase the speed by a factor of a million, resulting in petaflop machines. The world champion at last notice was China's Tianhe-1A supercomputer, which established a record in late 2010 of 2.57 petaflops. Compared to the advance from the 1980s, getting to an exaflop would only require increasing the speed by a factor of less than 400. What's that compared to the previous increase of a factor of a million?
A lot, it turns out, thanks to the power wall. A modern supercomputer usually burns between 4 and 6 megawatts (MW) of electricity -- enough to supply something like 5,000 homes. Researchers at the University of Illinois at Urbana-Champaign's National Center for Supercomputing Applications, IBM, and the Great Lakes Consortium for Petascale Computation are now constructing a supercomputer named "Blue Waters", which will be capable of 10 petaflops -- but will burn 15 MW of power, if the power to drive cooling systems is factored in.
10 petaflops is two orders of magnitude less than an exaflop; just scaling up the current technology to an exaflop would mean power consumption of 1.5 gigawatts, about a tenth of a percent of all US power generation at present. Such a supercomputer would need its own dedicated nuclear power plant right next door. The goals set for the DARPA study group, in contrast, specified an exaflop computer that would only require 20 MW.
To determine if that was possible, the DARPA study group examined the power requirements per flop of computer capability. At the time, computer circuitry required about 70 picojoules (pJ) -- a picojoule being 10E-12 joule, a joule being a watt-second. Further investigation showed the energy requirement should fall to 5 to 10 pJ a flop between 2010 and 2020. Unfortunately, that doesn't really help much, because simply performing floating-point operations is useless if data isn't retrieved from and stored back to memory, and that takes energy too. The study group concluded that the actual power required per flop would be from 1,000 to 10,000 pJ.
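The power arithmetic above is easy to verify; the following sketch simply restates the figures quoted in the text:

```python
# Figures from the text: Blue Waters at 10 petaflops / 15 MW, versus
# DARPA's exaflop goal of 20 MW.
blue_waters_flops = 10e15   # 10 petaflops
blue_waters_power = 15e6    # 15 MW, cooling included
exaflop = 1e18

# Naive scaling of Blue Waters technology up to an exaflop:
scaled_power = blue_waters_power * (exaflop / blue_waters_flops)
print(scaled_power / 1e9)   # 1.5 gigawatts

# DARPA's 20 MW target implies an energy budget per flop of:
budget_pj = 20e6 / exaflop * 1e12   # joules per flop -> picojoules
print(budget_pj)            # 20 pJ per flop -- versus the estimated
                            # 1,000 to 10,000 pJ once memory traffic counts
```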
The conclusion was that current technology would have to be largely rethought. One idea was to reduce the voltage levels used by the supercomputer logic chips, currently about 1 volt; the study group picked a half volt as a candidate. That would cut power consumption, since power is proportional to the square of voltage, but it also would cut speed and make circuits more vulnerable to noise.
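The voltage trade-off follows from the standard rule of thumb that dynamic CMOS power scales with the square of supply voltage; a one-liner makes the point (the quadratic rule is textbook CMOS behavior, not something the article states explicitly):

```python
# Dynamic switching power scales roughly as P ~ C * V^2 * f, so at fixed
# capacitance and clock, relative power goes as the voltage ratio squared.
def relative_power(v_new, v_old=1.0):
    return (v_new / v_old) ** 2

print(relative_power(0.5))  # 0.25 -- halving 1 V cuts switching power to
                            # a quarter, at a cost in speed and noise margin
```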
One of the study group members, Bill Dally -- then at Stanford and now chief scientist of Nvidia Corporation -- came up with a paper design based on the low-voltage concept. His basic module consisted of a chip with 742 separate microprocessor cores running at 1.5 GHz. Each core included four floating-point units and a small amount of nearby memory, a cache, for fast data access. Pairs of such cores shared a slightly slower second-level cache, and all such pairs could access each other's second-level and even third-level memory caches. One unusual feature of the design was that each processor core was linked to its own set of 16 dynamic RAM blocks; each processor chip also had ports for connections to up to 12 separate routers for fast off-chip data transfers.
One of these processor-memory modules by itself would be able to run at almost 5 teraflops; 12 of them could be packaged on a single board, and 32 such boards would fit in a rack, which would then provide peak performance of close to 2 petaflops. An exaflop-class supercomputer would require at least 583 such racks, which exceeded DARPA's target of 500 racks, but was not far out of the ballpark. Unfortunately, it would also require 67 MW of power, over three times as much as the DARPA target allowed.
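Dally's figures can be roughly cross-checked, assuming one flop per floating-point unit per clock cycle (my assumption; the article doesn't spell out the per-cycle rate):

```python
import math

# Figures from the text for Dally's paper design.
cores_per_chip = 742
fpus_per_core = 4
clock_hz = 1.5e9            # 1.5 GHz; one flop per FPU per cycle assumed

chip_flops = cores_per_chip * fpus_per_core * clock_hz
print(chip_flops / 1e12)    # ~4.45 teraflops per processor-memory module

rack_flops = chip_flops * 12 * 32   # 12 modules per board, 32 boards per rack
print(rack_flops / 1e15)    # ~1.7 petaflops per rack

racks = math.ceil(1e18 / rack_flops)
print(racks)                # ~585 by this rough count, near the 583 quoted
```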
There was another problem, in that the exaflop computer would have 160 million microprocessor cores, and it would be hard to keep them all gainfully employed. Realistic applications running on today's supercomputers typically use only 5% to 10% of the machine's peak processing power at any given moment; most of the other processor cores are just treading water, waiting for data they need to perform their next calculation. It has proved very difficult for programmers to keep a larger fraction of the processors continuously busy on an application, and the problem gets worse as the number of cores increases. In other words, even though it might be possible to build an exaflop computer, it would be inefficient because it would only be about 10 to 100 times faster than a petaflop computer.
A closer look at the pattern of memory usage of this supercomputer made matters worse, suggesting it would actually draw 500 MW of power. Then there was the issue of mass storage -- a supercomputer handles lots of data, and it needs someplace to store it. The study group concluded that current mass-storage technology couldn't do the job, but was unable to identify any new technology that seemed particularly promising. Finally, as mentioned, dropping the voltage means more noise, and an analysis showed that a system with 160 million microprocessors would be challenged to run reliably without additional hardware that would push up power consumption still more.
* So is an exaflop computer impossible? There's no reason to rule it out, but it's not going to be easy, demanding new electronics, new architectures, new software. DARPA is working to address the challenge with a new program titled "Ubiquitous High Performance Computing", the general push not being to come up with much faster supercomputers, but to make existing supercomputers cheaper and more compact. The US Department of Energy and the National Science Foundation are funding similar investigations, aimed at creating supercomputers for solving basic science problems.
Global computing power continues to grow, of course, and the amount of "cloud computing" power available for solving certain classes of problems -- those that can be split up into parts, with each part solved independently -- will continue to grow, almost inevitably yielding an effective exaflop computer. However, for the tightly-coupled problems that have been the traditional domain of supercomputing, it won't be a substitute for the real thing.
* CREATIVE CHAOS (2): India's "notoriously rough" roads suggest the challenges faced by India's boom. Those who like to compare India's growth to China's sometimes forget that China's economy is four times bigger than India's, which means India isn't going to catch up soon, with a number of obstacles hobbling progress. The rough roads are a sign of one of the biggest problems: poor infrastructure. Roads are in bad condition, traffic lights don't work, city traffic is snarled. India has nothing to compare with the US interstate highway system, with the added complication that crossing between Indian states can involve as much border bureaucracy as entering a foreign country, or more. With infrastructure so uncertain, big operations that want to keep running on a regular basis have to supply their own power backup, water treatment, and transport for employees. There are opportunities in the infrastructure challenge, with Indians tending to be clever about new ways of doing things, but everyone still wishes things would work better more of the time.
Estimates suggest India's economy is poised to grow by a factor of five by 2030, with the urban population likely to double from the 2001 census figure of 290 million to about 590 million. That implies spending about $1.2 trillion USD on urban infrastructure in that timeframe, about eight times the current rate. Per person, China's capital spending on cities is about seven times India's.
Another problem is the low level of education: 40% of the workforce is illiterate -- in typical developed nations, the figure is less than 1% -- and another 40% never finished school, while India needs hundreds of thousands more professionals than the country's educational system can churn out. India's best universities, the Indian Institutes of Technology, are first-class, but there are only 16 of them, and while they're good at teaching theory, they're not so good at teaching practical skills. Employers have to spend months training up graduates, only to find them poached away by rivals.
The public school system is a nightmare, with supplies simply disappearing and teachers sometimes not bothering to show up -- their jobs are protected by law, they can't be fired, and some just don't care. Widespread malnutrition among young children hinders their brain development, making them slow students. The government set up a program to deliver cheap grain to the poor, but it became a scandal, an ugly example of the "old India" at work, with much of the grain siphoned off to private pockets.
To be sure, the difficulty in finding skilled workers is good for workers with skills, since they can often name their own price, and change jobs if they don't like the one they've got. However, the shortage of competent builders, electricians, and plumbers is troublesome, with widespread horror stories of bungles on building sites -- a matter which made global news during the run-up to India's hosting of the Commonwealth Games, with the facilities set up for the games roundly condemned.
Yet another big difficulty is social instability. Hindu-Moslem tensions linger, but the big problem at present is the Maoist insurgents known as the "Naxalites" -- discussed here in 2008 -- who are common in India's backwoods and rural districts. They make life difficult for mining and logging firms, which have drawn populist wrath by heavy-handed evictions of villagers to make way for their operations. The Naxalites don't seem to have much sympathy in the cities and so haven't otherwise affected Indian businesses; the businesses mostly worry that Indian politicians will bend to the anger and take the populist road, hassling the business community to win points with voters.
Then there's corruption, long a problem in India, still a problem there. The prime minister, Manmohan Singh, has been widely respected for his honesty, but he's admired partly because he seems unusual, and skepticism about his willingness or ability to tackle corruption has been on the upsurge after recent high-profile scandals. Some estimate that a third of India's politicians and public officials are on the take. Indian businesses are generally honest, or at least as honest as they are elsewhere, but they're stuck with dealing with corrupt officials, and the rot tends to taint the entire system.
Finally, the success of India has led to challenges from abroad, in the form of backlash and copycats. The governor of the US state of Ohio made headlines in India by banning Ohio state agencies from offshoring their work. Some American firms are offshoring call centers to the Philippines, where labor costs are low and the culture is much more closely tied to the USA, the land having been a US colony for about half a century. [TO BE CONTINUED]
* THE KILLING OF JFK -- THE BALANCE OF EVIDENCE (16): No intact bullet was recovered from the head shot; the bullet disintegrated on striking JFK's skull. Of course, a number of fragments were found, with a pattern of fragments spread through JFK's head, and five other fragments found in the presidential limousine.
Given the generally intact bullet CE 399 and the various fragments, the question arises as to the relationships between them. Were they from two bullets, or more than two bullets? What fragments were associated with which bullets?
Forensic science has means of analyzing the chemical composition of bullets and bullet fragments to determine a chemical "fingerprint" of a particular bullet. The simplest is "spectroscopy": vaporizing a sample and then using a "spectroscope" to analyze the patterns of colors emitted by the gaseous sample, with the patterns of colors being distinctive to the elements in the sample. The problem with spectroscopy is that while it can be used to determine what elements are in a bullet, it's not so good as a means of determining their specific proportions.
A more sophisticated technique, "neutron activation analysis (NAA)", involves bombarding a sample with neutrons to make it radioactive, and then observing the high-energy "gamma rays" emitted by the sample as it decays. NAA is a very well established technique, used in many fields, capable of obtaining an accurate profile of the chemical composition of evidence using very small samples. The FBI conducted an NAA analysis in 1964 but, being unfamiliar with the procedure, judged the results inconclusive. The HSCA ran NAA tests again, the tests being performed by Dr. Vincent Guinn, a chemist from the University of California at Irvine and a recognized expert on the technique.
Guinn tested the bullet and fragments for their relative proportions of silver and antimony. His tests gave the following results, with the proportions given in parts per million:
   exhibit    notes                               silver       antimony
   _________  _________________________________  __________   __________
   A  CE 399  stretcher bullet                    8.8+/-0.5    833+/-9
   B  CE 842  fragment from Connally's wrist      9.8+/-0.5    797+/-7
   C  CE 567  fragment from front seat            8.1+/-0.6    602+/-4
   D  CE 843  fragment from JFK's head            7.9+/-0.3    621+/-4
   E  CE 840  fragments from floorboard carpet    8.2+/-0.4    642+/-6
The clusterings of the readings suggest an association between A & B, and an association between C, D, & E. On reexamination, the 1964 FBI NAA tests turned out to have given strikingly similar results.
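The clustering is visible in the raw antimony numbers alone; sorting them and looking at nearest-neighbor gaps is my own illustration, not Guinn's actual statistical method:

```python
# Guinn's antimony readings in parts per million, from the table above.
samples = {
    "A (CE 399)": 833, "B (CE 842)": 797,
    "C (CE 567)": 602, "D (CE 843)": 621, "E (CE 840)": 642,
}

# Sort by antimony content and report the gap between each neighboring pair:
ordered = sorted(samples.items(), key=lambda kv: kv[1])
for (n1, v1), (n2, v2) in zip(ordered, ordered[1:]):
    print(f"{n1} -> {n2}: gap {v2 - v1} ppm")

# The gaps inside {C, D, E} and inside {A, B} are a few tens of ppm, while
# the gap between the two groups (642 -> 797) is 155 ppm -- the two-cluster
# pattern the text describes.
```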
* As a footnote to the issue of the bullet fragments, conspiracy theorists became very excited over the fact that the FBI obtained from the autopsy, as part of the collection of evidence, what was described in a receipt issued by the hospital as a "missile" -- implying a bullet, and in fact a fourth bullet, since the first bullet was never recovered, the second bullet was discovered at Parkland Hospital, and all the autopsy turned up was fragments. The HSCA gave the matter a good looking-over and determined the receipt was misleading: what the FBI had actually been given was two bullet fragments.
In any case, the NAA tests on the bullet fragments have been strongly criticized by conspiracy theorists: on the basis of the "chain of custody", on the methodology used in the tests, and on the interpretation of the results. As far as the "chain of custody" issues go, the criticism amounts to saying that the evidence was faked; that's hard to disprove, but since there's no real evidence for fakery, it's also hard to demonstrate.
As far as the methodologies go, Guinn was an authority on NAA, and professional reviews of his work on the assassination have given him a "pass". Conspiracy theorists have sniped at the "peer reviewed" scientific literature, claiming it is biased against conspiracy theorists -- but scholars are notoriously nitpicky themselves, quick to challenge each other's work if they detect slop, while the technical "expertise" of conspiracy theorists invites skepticism.
It is well established that the chemical compositions of bullets vary widely. Guinn's own tests on a wide range of bullets show the proportions of silver and antimony varying over so great a range between different bullets that a logarithmic plot is required to display them; the results on the five samples tested by Guinn fall within a small range. Guinn also performed a "control" analysis on a box of Western Cartridge Company 6.5 millimeter Carcano ammunition, and discovered the composition varied widely between bullets from the same box, something he hadn't seen in his studies before. In other words, the NAA test was able to clearly discriminate between different types of bullets, and even between Carcano bullets from the same box.
Conspiracy theorists have made a fuss over the fact that some studies show the chemical composition of a bullet may vary to a degree over different parts of the bullet. However, it is difficult to understand how that is supposed to make it less likely that fragments with similar chemical compositions came from the same bullet; it doesn't logically follow, and the argument is one of desperation. The bottom line is that the NAA evidence is consistent with the Warren Report conclusion of two shots striking home, and inconsistent with any other conclusion. [TO BE CONTINUED]
* SCIENCE NOTES: The inhabitants of the surprisingly elaborate microecologies of carnivorous pitcher plants, for example pea-sized frogs, have been discussed here in the past. WIRED Online reports that Nepenthes rafflesiana elongata pitcher plants in Borneo provide a tidy home for the thumb-sized wooly bat, with up to three bats hiding out in a pitcher during the day. While no doubt the bats originally just found the pitchers a convenient place to snooze, the pitcher now features adaptations such as an unusual length, an unusually low amount of liquid, and a restriction in its "throat" to keep the bats from falling to the bottom and drowning.
What's in it for the plant? Easy -- since carnivorous plants live in nitrogen-poor soils, they have to obtain nitrogen in other ways, normally from the corpses of insects. The bats pay their "rent" in the form of nitrogen-rich droppings. Researchers got curious about the plant when they observed that it only caught about a seventh the number of insects as other pitcher plants. One of the researchers described the presence of the bats in the plant as "totally unexpected."
* While carnivorous plants such as Venus flytraps, sundews, and pitcher plants are fairly well known, the "Utricularia" AKA "bladderworts" are obscure. The bladderworts are aquatic plants of global distribution, with little floating flowers and free-floating roots, the roots decorated with juglike traps resembling root nodules. They feed on small prey, ranging in size from water fleas to mosquito larvae to tadpoles; when prey trips one of the trap's trigger hairs, the trap inflates instantly and sucks the victim in.
Physicists at the French Laboratoire Interdisciplinaire de Physique used a high-speed camera to observe the bladderwort trap in action. The trap is normally sealed by a little door; it takes several hours to arm, with the water inside the sealed trap slowly pumped out. This bends the trap walls inward, placing them under elastic stress. When triggered, the door pops open inward, with the trap walls snapping back, sucking in water and the prey.
* The notion of "metagenomics" -- investigating the genomes of an association of organisms instead of just one -- has been discussed here in the past. Now a report from the US Department of Energy (DOE) Joint Genome Institute (JGI) shows how the JGI, working with the Energy Biosciences Institute (EBI), has used metagenomics in an effort to improve "cellulosic" biofuel technology.
While cellulosic plant materials such as switchgrass are tempting targets for biofuel production, being easy to raise on land not usable for crop plants, their tough cellulose and lignin molecules make them painfully difficult to break down into fuel. Ruminants have been breaking them down for a very long time, thanks to the microbiome of organisms in the cow "rumen", with these microorganisms having biochemistries capable of dealing with them. Researchers at the University of Illinois inserted samples of switchgrass bundled in nylon bags into a cow rumen -- which can be done by simply cutting a hole into a cow's digestive system and sealing it off with a plug; oddly, researchers rummaging around inside the cow don't bother the cow a bit -- and then removed them after 72 hours. The DNA of the assemblage of microorganisms digesting the switchgrass was then sequenced, to be sorted for useful enzyme sequences.
The DNA in the samplings ran to 270 billion base pairs, about two orders of magnitude greater than the human genome, and so the researchers necessarily screened for a specific class of genes, those coding for "carbohydrate-active enzymes (CAZymes)" potentially capable of breaking down cellulose. The screening yielded 27,755 candidates, with 90 that seemed promising tested for enzymatic effect and about 20% of them proving effective in breaking down switchgrass -- suggesting there were plenty of other potentially useful enzymes lurking in the untested genes. The researchers also assembled 15 complete genomes of rumen microbes, all of which proved substantially different from known microbe genomes.
* BACK TO THE EARTH: As reported by an article from THE ECONOMIST ("Green Funerals", 16 September 2010), funeral practices turn out to have a substantial environmental footprint. A cemetery burial takes up land, and the environmental costs of maintaining the landscaping add up steadily. Coffins of valuable hardwoods like oak and mahogany can seem like a dubious expenditure for someone who is past caring about them. After burial, bodies can release pollutants like formaldehyde into the ground. Cremations also require a fair amount of energy, though unlike a burial, a cremation is mostly a one-shot deal; cremations can also contribute to heavy-metal pollution, because the ashes contain mercury amalgams from tooth fillings.
While few want to give their relatives a disrespectful send-off, there's increasing interest in "greener" burials. A 2007 survey conducted by the AARP, an American senior-citizens' lobby, revealed that more than a fifth of respondents wanted greener burials, and later surveys have reinforced that impression. That can mean things such as coffins in the form of cardboard boxes or baskets of willow branches, both of which degrade quickly, but some have more radical visions.
One push is for people to be buried in natural habitats, not in a neat cemetery with rows of headstones. Although natural-burial grounds exist in locales ranging from America to Australia, they're particularly common in Britain, thanks largely to friendly regulations. The first natural-burial ground was set up in the UK in 1993; they now number well over 200. Ireland is now opening up its first natural-burial ground, with the management getting calls from indebted property owners asking if their undeveloped land could be turned into natural-burial grounds, too. From the point of view of a land investment, it's not that good a bet, since burial grounds can't be redeveloped and so are not particularly useful as collateral. However, they can be used for secondary purposes -- the new Irish natural-burial ground is also a managed forest.
New technologies may have their place as well. One new idea is "water cremation" or "alkaline hydrolysis", in which a corpse is dissolved in a heated solution of water and potassium hydroxide over the course of a few hours. The output is a liquid, which can be used as fertilizer, and an ash-like residue. Water cremation is much more energy-efficient than traditional cremation; facilities are not widespread yet, but they are beginning to appear in the UK, US, and Australia.
Another new approach involves freeze-drying a corpse in liquid nitrogen, then vibrating it to break it down into a powder. End processing evaporates the water, while sorting out mercury and other pollutants; the residue can also be used as fertilizer. The Swedish company that developed the process, Promessa, has been accused of over-promising and under-delivering, but company officials say they now have franchisees in the UK and South Korea.
Green funerals are still marginal, and the law hasn't completely caught up to the idea. Natural-burial grounds still take up land, though the fact that they are multiple-use areas helps. Certainly, there's likely to be public resistance to the idea of dissolving a grandparent and using the remains as fertilizer -- though cremation was also seen as outrageous when the practice was introduced in the West. However, as long as more people keep specifying green funerals in their burial instructions, green funerals are going to become more and more visible -- and ultimately, possibly the norm.
* JUNO TO JUPITER: As discussed by an article in AVIATION WEEK ("Juno To Jupiter" by Frank Morring JR, 21 March 2011), the US National Aeronautics & Space Administration (NASA) is preparing to launch a new probe to orbit the planet Jupiter this summer. The probe, named "Juno" after the wife of the Roman god Jupiter, will be innovative in that it will be solar powered, while all previous outer-planets probes have used radioisotope thermoelectric generators (RTG).
Juno features three solar panel arrays arranged around its central body, giving it a diameter of 20 meters (66 feet). The central body, which mounts communications antennas, instruments, and system electronics, is 3.5 meters (11 feet 6 inches) across and 3.5 meters tall. Since Jupiter has savage radiation belts, core electronics are stowed in a box with titanium walls over a centimeter thick. Once in orbit around Jupiter, Juno will spin at 2 RPM while it performs observations with a suite of nine instruments.
While Galileo, NASA's previous Jupiter probe, was placed into an orbit around Jupiter's equator, Juno will be placed in a highly elliptical polar orbit, rising high over the planet to drop down to an altitude of about 4,000 kilometers (2,500 miles) for close observations. Each orbit will take 11 days, with the probe taking six hours of observations on its closest approach, then returning data and being set up for the next pass before diving down again. Although the orbit will avoid the worst of Jupiter's radiation belts, the probe is still only designed to tolerate 30 orbits, less than a year's worth of observations, before it fries. Juno is focused on observations of Jupiter itself and will pay little attention to Jupiter's elaborate moon system.
Solar power was chosen for Juno to reduce cost; NASA has effectively run out of RTGs, and developing new ones promised to be expensive. No probe has ever operated on solar power that far from the Sun, five times as far away as the Earth. The solar panels are based on "ultra triple junction" gallium arsenide solar cells, built by Boeing's Spectrolab subsidiary, with a conversion efficiency of 28%. The arrays have thick cover glass to protect them from radiation, though they will still degrade over time and lose efficiency. At the distance of Earth the arrays can produce 18 kilowatts, but at Jupiter they will only be able to generate 400 watts -- and half that power will be required for heaters to keep the probe's payload systems from freezing up. However, thanks to careful power budgeting, the probe will still be able to operate on 200 watts, about enough power to run two bright incandescent bulbs.
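As a sanity check on those numbers, sunlight falls off with the square of the distance from the Sun; the sketch below simply applies that scaling to the article's figures (the inverse-square law is the only physics assumed here):

```python
# Rough check of Juno's solar power budget: sunlight intensity, and
# hence ideal solar array output, falls off as 1/r^2 with distance
# from the Sun.
power_at_earth_w = 18_000.0   # array output at Earth's distance (per the article)
distance_ratio = 5.0          # Jupiter, ~5 times as far from the Sun

ideal_power_at_jupiter_w = power_at_earth_w / distance_ratio**2
print(f"ideal array output at Jupiter: {ideal_power_at_jupiter_w:.0f} W")  # 720 W
```

The ideal figure comes out to 720 watts, noticeably above the 400 watts cited; the gap is presumably accounted for by radiation degradation and the behavior of the cells under low light and low temperature.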
Juno's science payload will have nine instruments and 25 sensors, including a microwave radiometer to probe beneath the cloudtops; magnetometers; a gravity-science experiment; energetic-particle and plasma detectors; a radio and plasma waves instrument; an ultraviolet spectrograph; and an infrared imager-spectrometer for observing Jupiter's aurorae.
The probe will also carry a color camera named "JunoCam", mainly intended for publicity purposes. The spacecraft will have a launch mass of about 3.5 tonnes (3.85 tons), 2.5 tonnes (2.75 tons) of that being fuel. It will be launched from Cape Canaveral by an Atlas 5 551 booster in mid-August, to perform an Earth flyby for gravity assist in October 2013, and arrive in Jupiter orbit in July 2016. The Atlas 5 551 has a payload shroud with a diameter of 5.4 meters (17 feet 8 inches), five solid rocket boosters, and an upper stage with a single Centaur engine. Total mission cost is given as $1.09 billion USD.
* CREATIVE CHAOS (1): The bright prospects for economic development in India were discussed here last year; an article from THE ECONOMIST ("A Bumpier But Freer Road", 2 October 2010), took a closer look at India's boom.
Welcome to the city of Gurgaon, near Delhi; it's become a center for global outsourcing, full of shiny new buildings, though villagers still herd goats along its streets while pigs root through the garbage. Pramod Bhasin, boss of the Genpact firm, explains that companies shouldn't waste their resources wading through paperwork when they can outsource it to his people. A company's personnel department needs a few people to deal face-to-face with employees, but form-filling and data entry are better farmed out to those who specialize in them. Says Bhasin: "I've got ten thousand people doing this. They're good at it."
Genpact began life as an internal unit of global giant General Electric before GE spun it off in 2005, to allow the organization to reach a wider range of customers. Genpact provides both outsourcing and consulting, helping customers such as hospitals figure out how to do dull paperwork sorts of things more effectively. Business is booming, and Genpact isn't alone: India's GDP is expected to grow by at least 8.5% in 2010. Some observers believe India's growth is beginning to outpace China's, and that in fact it will have the fastest growth rate of any large nation for the next few decades.
Why is India on a roll? One reason is demography: lots of young Indians who want to work. The ratio of Indians aged under 15 or over 64 to those of working age has fallen from 69% in 1995 to 56% in 2010, according to the United Nations (UN). India's working-age population is poised to increase by 136 million by 2020, while the estimate for China is only 23 million. To be sure, the education level of India's workers is spotty, and so not all are suited for jobs in information technology and the like; oddly, unlike most developing Asian nations, India hasn't really pushed labor-intensive manufacturing that needs more warm bodies than brains. In any case, India stands to have a young and growing workforce, with the fact that English is commonly spoken being a real plus on the world market.
Another reason is the wave of economic reforms that took place in the early 1990s that knocked down tariff protections and scaled back the "license raj", the strangling web of permits and regulations that made getting things done very troublesome. Private firms are now exposed to global competition, with many finding themselves well up to the challenge.
Indian firms are increasingly international and some are world-class. Arcelor Mittal, based in Luxembourg, is the planet's biggest steel firm. Tata Motors, best known for ultra-cheap cars, also owns Jaguar and Land Rover, two long-recognized upscale British carmakers. Bharti Airtel, a mobile-phone firm with 140 million subscribers in India, is moving aggressively into Africa. China's growth is state-driven; in India, entrepreneurs are the movers and shakers. Go into Dharavi, a Mumbai slum with possibly a million inhabitants and full of narrow pathways impersonating streets, and it seems every other door opens into a small business.
While Indian firms are exporting heavily these days, they're not ignoring the domestic market by any means. Since India is still largely poor, making money means providing goods that are cheap and serviceable, not fancy. Tata Chemicals makes a filter that can provide a family of five clean drinking water for a month for a mere 30 rupees (65 cents). Researchers at the Indian Institute of Technology and the Indian Institute of Science recently unveiled a prototype for a $35 USD laptop computer. A firm named Ayas Shilpa sells cheap, durable suspension bridges -- and does a very good business, India being a generally rugged country with countless villages whose only connection to the outside world over raging rivers and deep ravines has been frightening rope bridges. Indian firms have been exploring new business models, not just to reach out to poor customers, but also to figure out clever ways to bring in business from rich ones.
Foreign firms working in India are sharing in the boom. LG Electronics, a South Korean firm, is the biggest seller of gizmos in India, with a keen understanding of the Indian market. Since many Indians are vegetarians, the company sells a fridge with less freezer space and more drawers for veggies. Since Indians tend to like to play their TVs loud, LG TVs feature big speakers. The company sells voice-activated washing machines for middle-class families with illiterate maids; its products can deal with irregular power, and its packaging is extra-rugged since India's roads are notoriously rough. [TO BE CONTINUED]
* THE KILLING OF JFK -- THE BALANCE OF EVIDENCE (15): What really outrages conspiracy theorists about the idea that the head shot was from the rear is the fact that the Zapruder movie shows JFK's head jerking backwards on impact. The shot had to have been from the front; if it had been from the back, wouldn't it have knocked JFK forward? The autopsy data had to have been faked -- intuitively obvious, right?
The difficulty with that intuition is that very few people have observed enough shootings to know if the intuition reflects reality. They see people get shot in TV shows and movies, and so their assumptions end up reflecting what they see in fiction. Michael Baden called such assumptions "absolutely wrong" and commented: "So much has been made of Kennedy's movement in the Zapruder film, and yet it is one of the least important parts of the case. By his movement alone, you can't tell which direction he was shot from. You then need to examine the bullets, the bones, tissue, X rays, and photographs to determine from where the bullet came. I have personally done thousands of gunshot autopsies. There is no doubt that the bullets that hit John Kennedy, both in the neck and in the head, came from the rear. Nothing hit him from the front."
John Lattimer suggested that when the bullet entered Kennedy's brain, it set off a massive spasm that arched him backward, a motion that could have been enhanced by the president's hefty back brace. As for conspiracy theorists who insist that simple physics says that something shot from the rear couldn't fall backwards, Luis Alvarez, a Nobel Prize-winning physicist, investigated the matter. What Alvarez pointed out was that the majority of the momentum transferred by the bullet passing through JFK's skull ended up in the pieces that were blasted out of the ugly head wound, with the reaction of this "jet effect" driving the president's head backwards. The Zapruder movie shows JFK's head bursting from the third shot; obviously, an explosion is going to blast things around.
Alvarez knew his reasoning might sound theoretical, so he made a video of shooting a watermelon from the rear -- with the watermelon falling backwards. John Lattimer constructed a dozen "artificial heads" and shot them, demonstrating much the same results. Conspiracy theorists denounced the experiments, saying that other experiments along the same lines didn't always result in a backwards motion relative to the bullet impact, and that the impact on the watermelon in the Alvarez video looked nothing like the head shot in the Zapruder film. However, this response was "moving the goalposts", since the experiments showed that the laws of physics didn't have a problem with the idea that something shot from the rear might fall backwards: it might not be guaranteed to fall backwards, but it wasn't ruled out. In addition, obviously shooting a watermelon wasn't going to look exactly like shooting a human in the head. Conspiracy theorists were unimpressed, some even hinting that Alvarez didn't understand basic physics.
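Alvarez's jet-effect argument amounts to a simple momentum balance. As a sketch (the symbols, and the treatment of the ejecta as a single lump with one mean speed, are illustrative simplifications, not Alvarez's actual analysis): let a bullet of mass m_b at speed v_b strike a head of mass M, blasting out ejecta of total mass m_e at forward speed v_e. Conservation of momentum along the line of fire gives:

```latex
m_b v_b = M\,\Delta v + m_e v_e
\quad\Longrightarrow\quad
\Delta v = \frac{m_b v_b - m_e v_e}{M}
```

If the forward momentum carried off by the ejecta, m_e v_e, exceeds the momentum delivered by the bullet, m_b v_b, then the head's recoil velocity Δv is negative -- the head snaps backward, toward the shooter, with no violation of Newton's laws.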
* Incidentally, the popular MYTHBUSTERS TV program -- which examines various urban myths and hokum peddled in pop fiction -- once did an experiment to deflate the common Hollywood image of a victim flying backwards after being shot. The Mythbusters built a dummy and mounted it upright on a rack that allowed it to slide off backwards, demonstrating that the impact of a baseball would cause it to fall off. They then put a bulletproof vest on it and shot it at a range of 6.7 meters (22 feet) with a 12 gauge shotgun, firing a slug. The dummy simply slumped off the rack and fell on its back, despite the fact that the bulletproof vest soaked up the entire force of the impact.
Not being satisfied with this experiment, they then put a thick steel plate behind the bulletproof vest and shot the dummy again, this time with a 12.7 millimeter (0.50 caliber) sniper rifle at the same range. The dummy fell back about a hand's width and again landed on its back. The 12.7 millimeter bullet actually punched through both the bulletproof vest and the steel plate, being found lodged in the dummy's spine. In other words, the impact of an extremely powerful bullet doesn't toss the target around much even if the target completely soaks up the hit. So much for intuition. Interestingly, the Mythbusters showed clips from old-time Westerns, where the black hats get shot -- to simply groan and fall over: "Yuh got me, pardner." Hollywood eventually decided that wasn't dramatic enough, and decided to attach a rope to the back of a stuntman to jerk him flying backwards for a more spectacular effect.
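A back-of-the-envelope momentum calculation shows why the dummy barely moved, even soaking up the entire hit. The bullet mass, muzzle velocity, and dummy mass below are illustrative guesses, not figures from the program:

```python
# How fast can a bullet shove a human-sized target, even if the
# target absorbs ALL of the bullet's momentum? (Illustrative values.)
bullet_mass_kg = 0.042    # typical 12.7 millimeter (.50 caliber) bullet, ~42 grams
bullet_speed_ms = 850.0   # typical muzzle velocity, meters/second
dummy_mass_kg = 80.0      # human-sized target

# Conservation of momentum: all bullet momentum transfers to the target.
bullet_momentum = bullet_mass_kg * bullet_speed_ms     # ~36 kg*m/s
dummy_speed_ms = bullet_momentum / dummy_mass_kg
print(f"target recoil speed: {dummy_speed_ms:.2f} m/s")
```

The answer comes out under half a meter per second -- a gentle shove, nothing like flying backwards across the room. The bullet is simply too light, however fast it moves, to carry person-throwing momentum.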
* The Warren Commission never made much out of the fact that JFK's head bounced backwards, for the simple reason that all the forensic evidence showed JFK had been hit from behind, and so the direction his head bounced was not seen as all that significant. However, conspiracy theorists then made much of the issue -- and the HSCA, again leaning on the forensic evidence, less forgivably glossed over it.
During the mock trial in the UK, Vincent Bugliosi knew that the jury would be impressed by the reasoning of the conspiracy theorists, thinking that "a picture is worth a thousand words", and felt he needed a counterargument. The moment of impact of the final shot is given in frames 312 and 313, with JFK's head blasting open in frame 313. Close analysis of these two frames shows that Kennedy's head has moved slightly but noticeably forward and down between the two frames, consistent with a bullet impact from the rear.
It might be argued that was due to JFK's movement before the bullet's impact, but contrast-enhanced images of frame 313 show the impact blast scattering in a clearly forward direction relative to the motion of the limousine; once seen in the contrast-enhanced images, it is also noticeable in the normal-contrast image. The pattern of the burst is consistent with a hit from behind -- obviously it's the top of JFK's head that's been blown out, the back of it is intact, with the spray towards the front -- but difficult to explain for a hit from the front. Secret Service Agent Roy Kellerman, in the front passenger seat of the limousine, reported that the debris from the head shot landed "all over my coat."
Conspiracy theorists also like to point out that the two police on motorcycles behind the presidential limousine were spattered with blood and bits of tissue, suggesting the shot was from the front. In fact, the Zapruder movie shows them driving into the cloud of spray thrown up by the head wound. Despite all demonstrations to the contrary, conspiracy theorists still insist that the Zapruder movie shows that JFK was shot from the front. [TO BE CONTINUED]
* GIMMICKS & GADGETS: As noted here several times previously, the US military has a strong interest in renewable energy as a way of reducing the logistical overhead of supplying fuel to frontline bases. Those who think that means the Pentagon is being taken over by tree-hugging hippies will have their preconceptions reset by one of the latest investigations by the Defense Advanced Research Projects Agency (DARPA), the military's "blue sky" investigations office: little nuclear reactors.
The DARPA "Small Rugged Reactor Technologies (SRRT)" study does not envision giant cooling towers dominating the horizon, instead looking for a reactor "well below the scale of the smallest reactors that are being developed for domestic energy production." The SRRT program also envisions the use of fuels, most prominently thorium, that cannot be enriched to create a bomb -- though nobody has yet built an operational thorium reactor.
The SRRT study is only funded at $10 million USD, which is tiny, meaning it's just a technical feasibility study. DARPA is looking at a range of possible technologies to reduce the energy footprint of a military forward operating base, with $50 million USD being allocated to investigate a more fuel-efficient gas turbine. Considering that each gallon of diesel delivered to the front lines costs Uncle Sam hundreds of dollars, there's plenty of motivation to be more efficient.
DARPA, leaving no stones unturned, is also attempting to address another part of the energy equation: storage. One of the biggest problems with renewable energy is that it's inconstant, since wind doesn't blow all the time and the Sun goes down at night, making energy storage a critical need. Under the "Deployed Energy Storage Program", DARPA is pursuing, by increments, an ultimate goal of a tactical power storage system that could provide 100 kilowatts for 30 days. DARPA officials have made it clear that they will not only consider highly innovative ideas, they won't consider anything else.
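To get a feel for how ambitious that goal is, consider what 100 kilowatts for 30 days amounts to, and what it would weigh as batteries; the lithium-ion energy density used below is a rough illustrative figure, not anything from DARPA:

```python
# DARPA's ultimate storage target: 100 kW sustained for 30 days.
power_w = 100_000.0
hours = 30 * 24
energy_wh = power_w * hours
print(f"required storage: {energy_wh / 1e6:.0f} MWh")          # 72 MWh

# As lithium-ion batteries, at a rough ~150 Wh/kg for packaged cells:
li_ion_wh_per_kg = 150.0
mass_tonnes = energy_wh / li_ion_wh_per_kg / 1000.0
print(f"battery mass at 150 Wh/kg: ~{mass_tonnes:.0f} tonnes")
```

72 megawatt-hours works out to hundreds of tonnes of conventional batteries -- which makes it obvious why DARPA is insisting on ideas well beyond the current state of the art for a "tactical" system.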
There was a time when the military was at the leading edge of technology development, but it fell to the trailing edge when the digital electronics age arrived in force. Now it seems the military may be able to regain an edge, at least in some respects, such as advanced biomedicine and energy systems. And what makes DARPA's efforts along such lines seem so attractive is that the agency isn't pursuing such programs out of a blue-sky vision for a future America, but in pursuit of high-priority practical military needs.
* In this vein, regarding the item run here a few months back on the US Marine Corps (USMC) experimenting with frontline renewable energy in Afghanistan, the service reports it turned out an outstanding success and has been expanded. Two patrol bases are currently operating entirely on renewable energy, with a 90% reduction in fuel use at a third base -- with the Marines even able to conduct a three-week foot patrol without battery resupply.
The USMC is now investigating solar concentrator systems that will not only be more compact, but will also provide hot water. Another effort is focused on making vehicles more efficient when idling; Marines often use idling vehicles as a source of power, and doing so gobbles up expensive fuel. The Corps is just asking for ideas right now, for example addition of "auxiliary power units" -- fuel cell systems for example -- or schemes to permit low-power use of a vehicle engine, for example shutting down some of the cylinders.
Obtaining fuel of course isn't the only difficulty faced by frontline troops; another problem is obtaining water, which in the worst case may have to be trucked in at great expense. The US Department of Energy's Oak Ridge National Laboratory (ORNL) has come up with a surprising approach to obtaining fresh water -- from the exhaust of diesel-powered vehicles. There's steam in diesel exhaust, and ORNL researchers say it can be cleanly condensed into water. The quantity's not trivial either, with a liter of diesel in principle capable of producing a liter of water, though in practice 75% recovery would be good. There doesn't seem to be any commitment to fielding the concept yet, however.
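The liter-for-liter claim checks out on paper: all the hydrogen in the fuel burns to water vapor in the exhaust. The sketch below uses C12H23 as a stand-in formula for diesel and a typical density, both illustrative approximations rather than ORNL's figures:

```python
# Why a liter of diesel can yield roughly a liter of water: every
# hydrogen atom in the fuel ends up in an H2O molecule.
diesel_density_kg_per_l = 0.84      # typical diesel density
h_mass = 23 * 1.008                 # hydrogen in one C12H23 unit, g/mol
c_mass = 12 * 12.011                # carbon in one C12H23 unit, g/mol
h_fraction = h_mass / (h_mass + c_mass)   # ~14% hydrogen by mass

# 2 H (2.016 g) -> one H2O (18.015 g): ~8.94 kg of water per kg of hydrogen
water_per_kg_h = 18.015 / 2.016
water_kg_per_l = diesel_density_kg_per_l * h_fraction * water_per_kg_h
print(f"water per liter of diesel: {water_kg_per_l:.2f} kg")
print(f"at 75% recovery: {0.75 * water_kg_per_l:.2f} kg")
```

The combustion arithmetic gives right around a kilogram -- that is, about a liter -- of water per liter of fuel, matching the "in principle" figure cited, with 75% recovery yielding roughly three-quarters of a liter.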
* Amazon.com's Kindle e-book reader has been a big hit. Now the company is extending Kindle's reach by setting up the "Kindle Library Lending" program, under which 11,000 US libraries will stock e-books online that can be checked out by library users. The e-books will have "timeouts" that disable them after the checkout period, and only one reader will be able to check out a particular e-book at a time. Readers will be able to annotate books they check out; no other readers will see the annotations, but they will be saved for the next time the reader checks out the book.
While one applauds Amazon for their embrace of one of the most venerable "open systems", the public library, there's the puzzling question of how the business model works. No doubt it's a logical extension of the business model for print media relative to the libraries: a library buys a book, lends it out to readers with restrictions, and the authors don't complain. I suppose a case could be made that the author comes out ahead from library sales if library users aren't likely to buy the book themselves, and libraries have historical momentum protecting them. Traditionally, publishers have lived with the system, but now one publisher, HarperCollins, is attempting to sell libraries e-books that disable themselves after being checked out 26 times. Libraries haven't been enthusiastic.
* INTESTINAL FLORA VERSUS DIET: We have long known that our lower digestive tract is home to a diverse community of benign microorganisms, but it's only been recently that we've been learning just how elaborate our intestinal microecology really is. As reported by an article from THE ECONOMIST ("Hard To Stomach", 5 August 2010), recent research on the human "microbiome" has shown how diet has a strong influence on the makeup of our intestinal flora.
It is well established that while standards of public health are high in developed nations, Westerners in general and Western children in particular suffer from asthma, allergies, and other inflammatory diseases at a higher rate than in undeveloped countries. Researchers have been puzzled as to the cause of this anomaly. A study by Paolo Lionetti of the University of Florence in Italy and his colleagues suggests that at least part of the cause may be the impact of a Western diet on our resident microorganisms.
The Italian researchers compared the diets and gut bacteria of 14 healthy children from a village in the African country of Burkina Faso with a group of 15 Florentine children. The differences were slight at very young ages, with breast-fed toddlers in both countries featuring similar populations of gut bacteria. But among children who had been weaned, the two groups diverged dramatically. In Africa, fiber-rich meals of millet, legumes and other vegetables, along with the occasional termite, promoted a diverse mix of bacteria. European children, who ate typically high Western doses of sugar, fat and meat, had fewer microbial species.
Along with the difference in diversity, the researchers also saw differences in the composition of the bacterial populations between the two groups. Though the Florentine kids were healthy, the study revealed they were host to three times as many bacterial species associated with diarrhea, along with a bacterial profile associated with obesity.
The African kids, in contrast, had a bacterial profile associated with leanness, plus a higher proportion of microorganisms known to produce beneficial chemicals called "short-chain fatty acids (SCFAs)", linked to lower levels of allergies and inflammations. The African kids had more than twice the level of SCFAs of the Italian kids. In other words, healthy bacterial populations living in our gut may not only drive out unwanted pathogens, they may also be promoting health indirectly.
What this research does is point to possibilities -- but given possibilities, it is possible to perform tests. Researchers are now performing experiments to see if doses of beneficial bacteria could help deal with bowel disease and eczema. Our internal microecology is complex and will not yield its secrets easily; but we have now obtained a window into its processes, and we are going to learn much more.
* In closely related news, a study performed by researchers at the European Molecular Biological Laboratory (EMBL) on 22 Europeans found that the subjects exhibited three distinctive "enterotypes" -- bacterial communities each dominated by a particular bacterial genus: Bacteroides, Prevotella, or Ruminococcus.
Further studies of microbiomes from 13 Japanese and 4 Americans returned the same three clusters, suggesting the patterns are widespread and unconnected to ethnicity, age, or gender. With such a limited sample size, however, containing no microbiomes from South Asia, Africa, South America, or Australia, it remains to be seen whether other enterotypes exist. The researchers flatly admitted that they don't really know what to make of the groupings just yet, or what their causes and effects are, but are excited over the challenge of learning more.
* THE HANFORD CHALLENGE: While nuclear power has clear attractions, it continues to be dogged by concerns over safety, long-term handling of radioactive waste products, and an inclination towards skyrocketing costs. As discussed by an article from THE ECONOMIST ("From Bombs To $800 Handbags", 19 March 2011), even before the fiasco at the Fukushima reactor complex in Japan, some of the difficulties with atomic power were clearly on display at the American nuclear development complex at Hanford, in Washington state.
The site was chosen during World War II because it could obtain large amounts of cooling water from the mighty Columbia River, as well as for its isolation in the sagebrush barrens of south-central Washington state -- despite the stereotype of Washington as rainy, much of the central part of the state is desert. Isolation helped with security and reduced the impact of any accident.
No major accident took place up to the final shutdown of the last of Hanford's reactors in 1987, but operations at the site left a legacy of 200 million liters (53 million US gallons) of toxic radioactive waste stored in 177 increasingly decrepit underground tanks. Obviously something will have to be done with the waste before it starts to poison the grounds and the Columbia River, but it's going to have to sit tight until 2019, when a "vitrification" plant will start sequestering the waste into glass logs. The logs will still be radioactive, but they will be much easier to handle and won't leak. However, it will take until at least 2047 to complete the job at a total cost of about $74 billion USD.
It may well take longer and cost more than that, since the US Defense Nuclear Facilities Safety Board, which oversees how the US Department of Energy (DOE) runs its nuclear-weapons sites, has not been pleased at what it has seen at Hanford. The board has been investigating allegations that the DOE has fired or pressured employees who spoke out about problems, while the DOE has accused the board of, in so many words, irresponsible meddling in the DOE's work. Whatever the merits of the case on each side, the dispute is likely to complicate the cleanup effort at Hanford.
Local citizens don't complain. The nearby towns of Richland, Pasco, and Kennewick -- known regionally as the "Tri-Cities" -- were great beneficiaries of the Cold War and its drive to build bombs, bombs, and more bombs. Richland High School's team is the "Bombers", and the mushroom cloud motif still lingers on signs here and there. Less noticeably, the Tri-Cities population features one of the highest concentrations of PhDs in the USA, and that hasn't changed since the reactors were finally shut down. Now the area is being kept afloat by cleanup money, having shrugged off the recession, with environmental scientists the top dogs instead of nuclear engineers. The Tri-Cities are making a good living both coming and going from nuclear power, and that isn't likely to change for at least another generation.
* NUKES DON'T PAY: A related article from TIME ("Why No Nukes? The Real Cost Of Nuclear Power" by Michael Grunwald, 25 March 2011) suggested that the talk of a revival of nuclear power was already fizzling out before Fukushima -- for the simple reason that few thought it was a good business deal.
The Obama Administration could hardly be described as anti-nuke, seeing nuclear power as the best way to provide power for a future America. The other main option, coal, is dirty in its own way, and also amounts to a heavy emissions contributor. Renewables are very sexy, but few honestly believe they can carry the power load on their own, while promoters of nuclear power can point to new designs, such as pebble-bed reactors, that are much safer than older systems. The White House has been able to make common cause with pro-nuke Republicans to push nuclear power; surprisingly, given how loud public protests over reactor construction were a generation ago, agitation against boosting nuclear power has been muted.
The problem is that US private capital hasn't been impressed. The American nuclear power industry ran down in the 1980s, with projects suffering from massive cost escalation that generally led to cancellations and defaults, and investors haven't forgotten that lesson. On a short-term basis, it's hard to justify building a nuclear power plant, since coal and gas are plentiful, and the recession has blunted demand for more power. Attempts by the Obama Administration to set up a "carbon tax" that would favor nuclear over fossil-fuel power went nowhere. Arguments over long-term benefits of nuclear power have limited ability to sway investors; they won't care about a long-term payoff if they are likely to go broke before they get it.
Of course, the controversial nature of nuclear power is a major factor in the tendency of costs for nuclear power plants to go through the roof, and the related sense of risk among investors. Advocates of nuclear power insist that the controversy is overblown, and maybe it is -- but it's there and showing no signs of going away. It is true that we are faced with a long-term energy crunch and don't have a handle on honestly persuasive solutions, but unfortunately, nuclear power doesn't look more persuasive than any of the others. Yes, we can certainly build safer reactors, though claims of perfect safety tend to inspire suspicion instead of confidence, and nobody seems to have honestly doped out the nasty problem of what to do with nuclear waste for the thousands of years required for it to cool off. There are certainly two sides to the arguments over the safety and environmental issues; but like it or not, there's no sensible way to deny that nuclear power is a lousy business proposition.
* ANOTHER MONTH: I ran across a blog of sorts that listed various offerings for USB flash drives, some featuring nice touches like mini-keychain configurations, secure storage, or ruggedized cases.
The page had some novelty flash drive configurations as well, beyond the familiar "doll" formats: a black-ball bomb in the style of silent-movie melodrama / comedy, with a USB cable where the sputtering fuze is supposed to be, plus USB drives in the form of miniature "pineapple" grenades.
One that I found very practical was in a charge-card format, so it could fit into a wallet. The card was hinged like a flip binder: just flip it open to reveal the connector, then shut it up and put it back in the wallet. I could use something like that.
* In other silly gimmick news, remember the "iCade" parody that went around on April Fool's 2010? A frame that turned an iPad into a classic arcade mini-console? Now ThinkGeek is selling it, at $100 USD. Since a classic games mini-console wouldn't be any fun without games, the iPad app store is offering a pack of 100 Atari games for $15 USD. The iCade does come with Pong, but I don't think that would satisfy the retro-game craving for very long.
I'm starting to get a craving for a tablet PC. Not just yet -- two or three years down the road, a 7-inch tablet may well be in the $150 USD range, and then I wouldn't be able to resist buying the toy and playing with it.
* I found myself slowly being overwhelmed by a flood of picky details to deal with and decided to get more organized. One thing to deal with, for example, was that my Amazon.com recommendations had become skewed and were no longer handing me items of interest. I went through and edited them; then I realized that I had rarely graded my past purchases with Amazon over the years, and so the system had no clear idea of my likes and dislikes. I went through my past purchases and graded all 800+ of them. That led to the realization that I had items on my Amazon.com wishlist gathering dust there for years; I decided to look over each item on the list and either drop it or buy it. Now I have an overlong pile of books to read.
Another item was tidying up power usage. My city utility organization just started sending out reports to individual customers on their power efficiency, and I got a top-rank grade in everything except electricity usage. That was likely because I spent most of the day at home, tinkering with my PC, so I decided to make sure I put my PC to sleep when I left it for a while. For the moment, I'm grinding through picky details efficiently, but it can't last. It helps to tidy things up, but after a time chasing after nitpicks just becomes fussing, a diversion from getting useful things done.