
DayVectors

oct 2011 / last mod may 2016 / greg goebel

* Entries include: JFK assassination (series), future of world food production (series), examining functionality of bacterial genome, bacterial endosymbiosis in mealybugs, silicon carbide for power electronics, temporary halt in sea rise & shifting storm tracks, e-books becoming popular & Amazon.com introduces Kindle Fire, floating solar panels & Atacama solar power, furor over HPV vaccine, jute back in fashion, and satellite servicing programs.

banner of the month


[MON 31 OCT 11] NEWS COMMENTARY FOR OCTOBER 2011
[FRI 28 OCT 11] THE KILLING OF JFK -- THE BALANCE OF EVIDENCE (37)
[THU 27 OCT 11] SCIENCE NOTES
[WED 26 OCT 11] PROBING THE BACTERIAL GENOME / WHEELS WITHIN WHEELS
[TUE 25 OCT 11] SILICON CARBIDE FOR POWER ELECTRONICS
[MON 24 OCT 11] FEEDING THE NINE BILLION (6)
[FRI 21 OCT 11] THE KILLING OF JFK -- THE BALANCE OF EVIDENCE (36)
[THU 20 OCT 11] SPACE NEWS
[WED 19 OCT 11] NO RISING SEAS?
[TUE 18 OCT 11] E-BOOKS ON A ROLL
[MON 17 OCT 11] FEEDING THE NINE BILLION (5)
[FRI 14 OCT 11] THE KILLING OF JFK -- THE BALANCE OF EVIDENCE (35)
[THU 13 OCT 11] GIMMICKS & GADGETS
[WED 12 OCT 11] FLOATING SOLAR ARRAYS
[TUE 11 OCT 11] HPV VACCINE BATTLE
[MON 10 OCT 11] FEEDING THE NINE BILLION (4)
[FRI 07 OCT 11] THE KILLING OF JFK -- THE BALANCE OF EVIDENCE (34)
[THU 06 OCT 11] SCIENCE NOTES
[WED 05 OCT 11] JUTE MAKES A COMEBACK
[TUE 04 OCT 11] SATELLITE SERVICING GETS REAL?
[MON 03 OCT 11] ANOTHER MONTH

[MON 31 OCT 11] NEWS COMMENTARY FOR OCTOBER 2011

* NEWS COMMENTARY FOR OCTOBER 2011: The biggest news of the month was the death in action of Muammar Qaddafi on 20 October, putting an end to his dubious career as the whimsical dictator of Libya. He was 69 years old. Like him or not, he was true to character in death; he said he wouldn't give up as long as he was alive, and he was as good as his word. What comes after him in Libya remains an open question.

the last of Qaddafi

* Following up on the article last month on the CIA drone assault on Islamic terrorists in Afghanistan, BBC WORLD Online reports that the agency is quietly ramping up its attacks on Islamists in and around the Horn of Africa, essentially meaning Somalia and Yemen. That the Americans have been on the warpath there is not news. The US set up a base in the Horn of Africa in late 2002, putting nearly 1,000 troops into a former French Foreign Legion base at Camp Lemonier in Djibouti, north of Somalia, calling the organization the "Combined Joint Task Force / Horn of Africa (CJTF-HOA)". At that time, the CIA began to conduct drone strikes in the region, notably killing the al-Qaeda leader in Yemen, Abu Ali Al-Harithi, in November 2002.

Then drone operations there went quiet, only to resume with a vengeance of late. There are new US bases in the Ogaden region of Ethiopia and in the Seychelles; the base in the Seychelles was set up two years ago, with a spokesman for the Pentagon's Africa Command (AFRICOM) saying that MQ-9 Reaper drone flights are being performed from there, but that these flights are for surveillance, the aircraft being unarmed. Rumors indicate that still another base will be set up in Yemen.

In Yemen's tribal provinces of Shabwa, Abyan and Marib, US drone strikes have definitely been inflicting pain on al-Qaeda in the Arabian Peninsula (AQAP). Of course there have been lethal mistakes in the strikes, due to faulty or out-of-date intelligence, which have tended to drive Yemeni tribesmen reluctantly closer to al-Qaeda. That can only help to further destabilize the regime of Yemeni President Ali Abdullah Saleh. However, Yemen is no stranger to war and violence, and for the time being the drone strikes seem likely to continue. Ironically, in 2008 US President George W. Bush promised African leaders there would be no escalation of the US presence in the region; obviously, the Obama Administration feels that matters now require a firmer hand.

* Preliminary skirmishing for the 2012 US presidential election continues, but I've only been glancing at it. The most interest I've had in it is through the "Bodog" gambling website, which has a page of betting odds on various presidential hopefuls. It's amusing to watch how dark-horse candidates emerge with a splash, to then gradually sink into the long odds as their credibility is examined and found wanting. Most of the focus is on the Republican primary race, for the simple reason that the Democrats already know who their candidate is. For the moment, Bodog odds on the next administration being Democrat or Republican are tilted slightly in favor of the Democrats.

Republican presidential hopefuls

In the meantime, the Obama Administration is pushing to get the US Supreme Court to rule on ObamaCare -- last discussed here early this year -- before the election. It makes sense, there being no point in letting the matter drag on; but there seems to be a division of opinion on whether a positive judgement will do Obama harm or good. Some believe the court action will simply aggravate attacks on ObamaCare; but on the other hand a court victory might strengthen Obama's position.

I'm puzzled as to why ObamaCare is so controversial. Yes, it does seem overbearing to tell everyone they must buy health insurance, but if the USA is to have universal health care -- and there seems to be a broad consensus that should be so -- then if the government doesn't give it to everyone, what else can be done but legislate that everyone get health insurance on their own and assist those who don't have the money for it? A universal health care system inevitably means universal funding, everybody pays, at least to their ability to do so; nobody can ride free. Very well, if everyone pays, is it preferable for citizens to pay directly for the services that they think best for them, or for the government to tax everyone and set up a monolithic health care system?

An argument could be made either way; but I have trouble figuring out any other options. To those who object to ObamaCare the only question is: "Then what do you want?" If they reply that they don't want universal health care, I would accept that as an answer, and have nothing more to ask.


[FRI 28 OCT 11] THE KILLING OF JFK -- THE BALANCE OF EVIDENCE (37)

* THE KILLING OF JFK -- THE BALANCE OF EVIDENCE (37): As with the rest of the JFK assassination, conspiracy theories have grown up around the killing of Officer Tippit. There are a number of tales that he was actually part of the conspiracy and working with Oswald, with conspiracy theorists trying to make out his actions on 22 November 1963 as "mysterious".

No credible evidence suggests that Tippit was anything but a police officer on a patrol route that day. He was out of his usual patrol district, but since a good proportion of the force was casing out Dealey Plaza at the time, the rest of the officers had to cover more area than usual. The Warren Commission performed an admittedly brief background check on Tippit, with the check including inspection of long-distance calls from the Tippit household in the weeks before the assassination, and found nothing suspicious. Nobody who knew or worked with Tippit noted that he gave any cause for suspicion, or that he was anything but what he seemed to be, an honest and underpaid cop struggling to get by.

One story about Tippit's supposed involvement in the conspiracy came to light in 1982, with a report from J.W. Stark, the owner of the Top Ten Record Shop in Dallas, not far from the scene of the Tippit shooting, and his clerk Louis Cortinas, who claimed that on 22 November, Tippit came into the shop in an excited state at about 1:00 PM, tried to make a call on a pay phone, didn't get through, and left in a hurry, to be killed only minutes later. The two men in the record shop knew Tippit because he had been in the shop several times before. They didn't try to publicize the story, and both told the same tale.

However, they didn't tell anyone about it until the 1980s, and on inspection the story fell apart: Stark also claimed that Lee Harvey Oswald came into the shop at 7:30 AM -- at the time when Oswald was being driven to work by Wesley Buell Frazier. To then completely wreck his credibility, Stark claimed that not only did Tippit come into the shop on a regular basis, but so did Jack Ruby, and even Oswald and his family, despite the fact that the Oswalds didn't have a car, couldn't drive, and had no spare money to spend on records.

There are claims Tippit was really trying to kill Oswald, but Oswald got the drop on him and hit him first. That is contradicted by the testimony of Helen Markham, who watched the whole scene and told the Warren Commission that Tippit took no hostile action against Oswald before Oswald shot him. Conspiracy theorists don't seem to have had qualms in portraying Tippit as a villain instead of a murder victim, despite the fact that the "Tippit as killer" story is nothing more than an exercise in imagination. It might also seem at least a bit implausible that the conspiracy would have a cop in a patrol car, hardly an inconspicuous figure, perform a "hit" on a pedestrian in broad daylight on a public street. There are a number of variations on this story, some of which involve Roscoe White of the Dallas police as an associate -- as noted, fingered by his son Ricky as an assassin -- or even claim that it was White who killed Tippit, not Oswald.

Another candidate proposed as an alternate "Tippit killer" was Igor "Turk" Vaganov, a 23-year-old Latvian immigrant and what would now be called a "neo-Nazi". He had arrived in Dallas in early November with his newly-married wife Anne. Only hours after the assassination of JFK, a juvenile probation officer from Conroe, Texas, had called up the FBI to report a story he had heard that Anne Vaganov had called her sister in Conroe the night before in a state of distress, saying that her husband was "up to something terrible." The FBI checked into it, examining Vaganov's rifle and 0.38 caliber revolver -- both of which he freely handed over to the authorities -- and determining from the manager of the apartment where the Vaganovs lived that the couple had been around the apartment for most of the day in question.

That might have satisfied the FBI, but it didn't satisfy conspiracy theorists, who pointed to the fact that Vaganov didn't live all that far from the site of the Tippit killing; he owned a 0.38 caliber revolver; and he owned a red Ford Thunderbird, linking him to the "red Ford" Domingo Benavides said he saw at the scene of the killing. Further investigation in 1967 showed that Vaganov was at a bank at the time of the Tippit shooting and had paperwork from the bank transaction to prove it; he also wasn't packing the revolver, since he was still out of the apartment when Anne Vaganov heard a report of a police officer being killed nearby and took out the weapon for self-defense. As for the "red Ford", as noted earlier, that would eventually be identified as Jack Ray Tatum's car.

What about the distraught phone call from Anne Vaganov? It turned out that the couple weren't getting along well -- the marriage would be short-lived -- and Anne feared her husband might hurt her or somebody else. Turk Vaganov actually later met with Helen Markham, who said that Vaganov wasn't the person who had shot Tippit; and Domingo Benavides, who said the same, but added that Vaganov seemed familiar. Vaganov said that Benavides seemed familiar as well, but the two couldn't place where they had met. Suggestions that the jacket dropped by the shooter was Vaganov's fell apart when it was determined that it was far too small for Vaganov. In any case, nobody ever found a trace of credible evidence that Vaganov was part of any conspiracy. [TO BE CONTINUED]


[THU 27 OCT 11] SCIENCE NOTES

* SCIENCE NOTES: The US National Aeronautics & Space Administration's (NASA) SOFIA flying observatory -- a 2.5 meter infrared telescope carried in a Boeing 747 jetliner and discussed here last year, the aircraft now formally known as the "Lowell Observatory" after famous astronomer Percival Lowell -- is now in service after a protracted development. On 23 June 2011, SOFIA demonstrated what it was capable of by performing a dash into the central Pacific to observe an "occultation" of a star by the passage of the minor planet Pluto. The change in brightness of the star as it winked out behind Pluto and then emerged again was measured by SOFIA's High-Speed Imaging Photometer instrument to obtain details of Pluto's atmosphere.

SOFIA observatory

Catching the occultation required that SOFIA fly 2,900 kilometers (1,800 miles) over the Pacific and be on location at precisely the right time. The observations were taken at SOFIA's operating altitude of 13,700 meters (45,000 feet), where the infrared absorption of the atmosphere is minimal.
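
Just to illustrate the principle -- this is a toy sketch, not SOFIA's actual photometry pipeline, and the sample numbers are invented -- the basic product of an occultation observation is the light curve of the star, from which the duration of the "chord" across Pluto and its atmosphere is extracted:

BEGIN PYTHON:

# Toy sketch: pulling the occultation chord duration out of a stellar light
# curve. The synthetic data and the 50% flux threshold are illustrative
# assumptions only; the real light curve fades gradually as Pluto's
# atmosphere refracts the starlight, and that shape is what carries the
# atmospheric information.
import numpy as np

rng = np.random.default_rng(42)
t = np.arange(0.0, 300.0, 0.1)                      # seconds, 10 Hz photometry
flux = 1.0 + rng.normal(0.0, 0.01, t.size)          # normalized stellar flux
flux[(t > 120) & (t < 210)] *= 0.3                  # star dims behind Pluto

smooth = np.convolve(flux, np.ones(11) / 11, mode="same")
occulted = smooth < 0.5                             # below half the unocculted level
ingress = t[np.argmax(occulted)]                    # first sample below threshold
egress = t[t.size - 1 - np.argmax(occulted[::-1])]  # last sample below threshold

print(f"chord duration ~ {egress - ingress:.1f} seconds")
# Multiplying the duration by the shadow's ground speed gives the length of
# the chord cut across Pluto, one ingredient in reconstructing the atmosphere.

END PYTHON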

* The sugar apple and its cousin the cherimoya are sweet tasty fruits of Latin America, not well known elsewhere because they're full of inconvenient seeds. Then one day a Spanish sugar apple farmer knocked on the door of researchers in Madrid with a mutant seedless sugar apple. Seedlessness is fairly common among domestic fruits -- grapes, watermelons, bananas -- and the new variant of sugar apple may lead to much greater consumer interest.

More importantly, the researchers from Madrid contacted genetics researchers at the University of California, Davis, with the two groups collaborating to determine the precise mutation that produced seedlessness. Seeds are usually important to fruit development because they produce hormones to promote fruit growth; in seedless fruits, fruit growth is promoted without the presence of seeds. The discovery of how this happens suggests the possibility of coming up with seedless varieties of fruits such as the cherimoya and the tomato. There is an irony in this, however: while the mutant sugar apple arose naturally and so gives no cause for public objection, duplicating the same mutation in tomatoes would make them a genetically modified food and a target of suspicion.

cherimoya fruits

* As reported by BBC WORLD Online, tiny structures found in sandstones from Western Australia dating back 3.4 billion years suggest that life existed at that time. There was no free oxygen in the Earth's atmosphere in those days, and so the microorganisms seemed to have used sulfur instead of oxygen to support growth and metabolism. There are still plenty of oxygen-averse "anaerobic" bacteria around these days with a similar lifestyle.

Australian sandstone microfossils

The fossils were first identified in 2007 at Strelley Pool, a remote location of the dry region known as the Pilbara. Their host sandstones were laid down in what would have been a beach or estuary. The microfossils are spheres and ellipsoids just a few microns (millionths of a meter) across; since there are geological processes that can also construct spheres and ellipsoids, the microfossils had to be subjected to a series of tests to validate that they really were likely to be the remains of living organisms.

The shape and clustering of the fossil deposits are certainly reminiscent of bacterial cells, but more importantly, the fossils are associated with tiny crystals of "fool's gold", iron pyrite, composed of iron and sulfur. The isotopes of these elements present in the crystals point to the pyrite being formed as a by-product of cellular metabolism based on compounds of sulfur. Life processes tend to be biased towards lighter isotopes, so if the concentrations of light isotopes are high, it suggests bioprocesses at work. There are ways to get the same signature without biology, but it requires very high temperatures. The proximity of pyrites with light isotope signatures to the microfossils very strongly suggests they were microorganisms.
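
For anyone curious about the bookkeeping, isotope ratios are reported as "delta" values in parts per thousand relative to a standard; sulfur processed by microbes comes out depleted in the heavy isotope, giving strongly negative values. The sketch below uses the commonly quoted reference ratio for the Vienna Canyon Diablo Troilite standard, but the sample ratios are invented for illustration, not numbers from the Strelley Pool study:

BEGIN PYTHON:

# Sketch of the sulfur-isotope argument. R_VCDT is the commonly quoted
# 34S/32S ratio of the VCDT reference standard; the two sample ratios are
# hypothetical, chosen only to show the sign convention.
R_VCDT = 0.0441626

def delta34S(r_sample):
    """Per-mil deviation of a sample's 34S/32S ratio from the standard."""
    return (r_sample / R_VCDT - 1.0) * 1000.0

samples = {
    "abiotic pyrite (hypothetical)":  0.044170,   # essentially unfractionated
    "biogenic pyrite (hypothetical)": 0.042800,   # depleted in heavy 34S
}

for name, ratio in samples.items():
    print(f"{name:32s} d34S = {delta34S(ratio):+6.1f} per mil")
# Microbial sulfate reducers preferentially consume the light 32S, so pyrite
# formed from their waste products shows large negative d34S values, as in
# the second case -- the kind of signature reported for the fossils.

END PYTHON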


[WED 26 OCT 11] PROBING THE BACTERIAL GENOME / WHEELS WITHIN WHEELS

* PROBING THE BACTERIAL GENOME: Researchers at the Stanford University School of Medicine have now cataloged exactly what parts of the genome of the bacterium Caulobacter crescentus are needed to keep it alive. They found that only 12% of the bacterial genome is essential for survival under lab conditions.

The bacteria studied is a non-pathogenic freshwater species that has long been used in molecular biology research. Its complete genome was originally sequenced in 2001. The essential elements found in the Stanford investigation included not only protein-coding genes, but also regulatory DNA and other small DNA segments of unknown function. The other 88% of the genome could be disrupted without harming the bacteria's ability to grow and reproduce.

The senior investigator on the project, Stanford biologist Lucy Shapiro, commented: "There were many surprises in the analysis of the essential regions of Caulobacter's genome. For instance, we found 91 essential DNA segments where we have no idea what they do. These may provide clues to lead us to new and completely unknown bacterial functions."

Caulobacter's genome, like that of most bacteria, is a single, ring-shaped chromosome. To perform their experiment, the researchers altered many Caulobacter cells so that each cell incorporated one piece of artificial DNA at a random location in its chromosome. The artificial DNA, labeled so the scientists could recognize it later, disrupted the function of the region of bacterial DNA where it landed. Over two days, the researchers grew these mutants until they had about 1 million bacterial cells, and then sequenced their DNA.

After computer analysis, the researchers created a detailed map of the entire bacterial genome to show exactly where the artificial DNA segments had been inserted in the chromosome of the surviving cells. This mutation map contained many gaps -- the regions of the DNA where no living bacteria had survived with an artificial DNA insertion. These regions, the researchers reasoned, must be essential for bacterial life since disrupting them prevented bacterial survival. Said Shapiro: "We were looking for the dog that didn't bark."
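
The logic of the gap analysis is simple enough to sketch in a few lines of code. This is only a cartoon of the approach -- the coordinates and cutoff below are made up, and the real study mapped on the order of a million insertion sites onto Caulobacter's roughly four-million-base-pair circular chromosome:

BEGIN PYTHON:

# Cartoon of essential-region calling from transposon insertion positions:
# stretches of the chromosome that no surviving mutant carries an insertion
# in are candidates for being essential. All numbers here are hypothetical.
GENOME_LEN = 4_000_000    # base pairs, treated as circular
MIN_GAP = 600             # flag insertion-free stretches longer than this

# mapped insertion positions from surviving mutants (tiny invented sample)
insertions = sorted([120, 530, 890, 1_600, 9_800, 10_050, 3_950_000])

def essential_candidates(sites, genome_len, min_gap):
    """Return (start, end) intervals containing no insertions."""
    gaps = []
    extended = sites + [sites[0] + genome_len]   # close the circle
    for a, b in zip(extended, extended[1:]):
        if b - a > min_gap:
            gaps.append((a % genome_len, b % genome_len))
    return gaps

for start, end in essential_candidates(insertions, GENOME_LEN, MIN_GAP):
    print(f"no insertions between {start:,} and {end:,} -- candidate essential region")

END PYTHON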

In total, the essential Caulobacter genome was 492,941 base pairs long and included 480 protein-coding genes that were clustered in two regions of the chromosome. The researchers also identified 402 essential promoter regions that increase or decrease the activity of those genes, and 130 segments of DNA that do not code for proteins but have other roles in modifying bacterial metabolism or reproduction. Of the individual DNA regions identified as essential, 91 were non-coding regions of unknown function and 49 were genes coding proteins whose function is unknown.

* WHEELS WITHIN WHEELS: In loosely related news -- mealybugs are somewhat nondescript little insects that survive by sucking the sap from plants. Other than being plant pests, they wouldn't seem to be of much interest, but it turns out that they have intriguing adaptations to their lifestyle.

citrus mealybugs

Microbiologist John McCutcheon of the University of Montana and biologist Carol von Dohlen of Utah State University investigated the citrus mealybug, finding that, since it can't get all the amino acids it needs to stay alive from plant sap, it relies on a symbiotic bacterium, Tremblaya princeps, to generate those nutrients for it. That's not so unexpected; many animals can't survive without help from symbiotic microbes. Says McCutcheon: "Some sap-feeding insects form stable relationships with one, two or even sometimes more symbionts to provision them with essential amino acids. These bacteria live only within special insect cells, which form special organs in the insect exclusively to house these bacteria."

What is surprising is that Tremblaya can't survive on its own, either. It needs help from its own symbiotic bacterium, Moranella endobia. Tremblaya also has an unusually small genome, only 121 genes -- a typical free-living bacterium has several thousand, and the human genome about 20,000. It would normally be very unlikely for a bacterium to stay alive with such a small genome; it seems Tremblaya can only get away with so few genes because of its dependence on Moranella, which provides the "missing" functions. This is how evolution works: genes that are no longer necessary for the survival of an organism tend to spontaneously break and disappear.

The relationship between the two bacteria suggests that ultimately Moranella will become a cellular organelle, part of Tremblaya. The mitochondria in our own cells, and the chloroplasts in plant cells, have their own reduced genomes and are generally seen as derived from symbiotic relationships with once-independent bacteria. As demonstrated by the mealybug's inhabitants, the process of assimilation continues.


[TUE 25 OCT 11] SILICON CARBIDE FOR POWER ELECTRONICS

* SILICON CARBIDE FOR POWER ELECTRONICS: Silicon has become the material of choice for fabricating electronic devices, but it has limitations, particularly when it comes to handling high power levels. As reported by an article from IEEE SPECTRUM ("Silicon Carbide: Smaller, Faster, Tougher" by Burak Ozpineci & Leon Tolbert, October 2011), after a long learning curve an alternate material known as silicon carbide (SiC) is beginning to move into that niche.

SiC diodes have already started to replace silicon devices in some applications, and SiC transistors are starting to come onto the market. SiC wafer manufacturers have steadily reduced the level of defects while increasing the wafer size, driving down the prices of SiC devices. Within a decade, SiC devices will be commonplace in hybrid and electric vehicles and the emerging "smart" power grid.

Silicon carbide's usefulness traces primarily to one property, its "bandgap". In a metal, the "valence" band of bound electron energies and the "conduction" band of freely flowing electron energies overlap, which is why metals conduct so readily. In a semiconductor like silicon, the two bands are separated by a "bandgap": electrons can't carry current until they get enough of an energy boost to jump from the valence band into the conduction band. The size of the bandgap is characteristic of each semiconductor material; for SiC, it's about three times wider than it is for silicon.

The main advantage of this big bandgap is that SiC resists device breakdown under high voltages. Silicon devices can't withstand electric fields in excess of about 300 kilovolts per centimeter. Anything stronger will pull on flowing electrons with enough force to knock other electrons out of the valence band. These liberated electrons will in turn accelerate and collide with other electrons, creating an avalanche of electric current that eventually destroys the device. Valence electrons in SiC require more energy to be pushed into the conduction band, with the result that SiC devices can withstand about ten times the voltage gradient that would destroy a similar silicon device. On the other side of that same coin, an SiC device designed for a given voltage can be about a tenth the thickness of a comparable silicon device, resulting in a faster device with lower resistive losses.
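
The payoff of the higher critical field is easy to put numbers on. The sketch below uses the idealized textbook relation between blocking voltage, critical field, and drift-layer thickness, with the rough field values quoted above; real device design also has to juggle doping, punch-through, and edge termination:

BEGIN PYTHON:

# Back-of-the-envelope drift-layer thickness for a given blocking voltage,
# using the idealized triangular-field approximation W = 2*V / E_crit.
# Critical fields follow the rough figures in the text: ~300 kV/cm for
# silicon and about ten times that for SiC.
E_CRIT_V_PER_CM = {"silicon": 300e3, "SiC": 3_000e3}

def drift_thickness_um(material, blocking_voltage):
    """Rough minimum drift-layer thickness (microns) to block the voltage."""
    w_cm = 2.0 * blocking_voltage / E_CRIT_V_PER_CM[material]
    return w_cm * 1e4   # centimeters to microns

for volts in (600, 1200, 1700):
    si = drift_thickness_um("silicon", volts)
    sic = drift_thickness_um("SiC", volts)
    print(f"{volts:5d} V blocking: silicon ~{si:6.1f} um, SiC ~{sic:5.1f} um")
# The ten-times-thinner SiC drift layer is where the lower resistance and
# faster switching come from.

END PYTHON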

Silicon power devices have also tended to be focused on bipolar transistors, which use both electrons and "holes" -- vacancies of electrons in the crystalline lattice -- to conduct current, giving them more current-carrying capability. However, bipolar devices are slow, taking too much time to switch on and off. SiC permits use of unipolar devices, in the form of "field-effect transistors (FETs)", which have much faster switching speeds.

If SiC is such a wonder material, then why isn't it in wide use already? Because it's hard to work with. When development of SiC wafers began in the 1970s, it proved difficult to grow single crystals of the material, since silicon and carbon don't tend to crystallize in a nice neat regular arrangement. By 1991, however, a US startup named Cree was able to release the first commercially available SiC wafers. Other manufacturers have jumped in, with the industry gradually increasing the size of SiC wafers, allowing more devices to be fabricated on one wafer. 4-inch SiC wafers are now common, with 6-inch wafers in the wings.

Another problem with SiC crystals is defects. Unlike silicon, SiC doesn't have a molten liquid phase, and so SiC crystals are grown layer by layer from vapor at a temperature of about 2500 degrees Celsius. Vapor deposition is difficult to control and can often create tiny, tornado-like tunnels called "micropipes" arising from defects in the crystal early in the wafer formation process. Devices built on top of micropipes don't work right; even a few micropipes per square centimeter is enough to seriously reduce chip yields. The micropipe problem is being addressed: in 2005 Intrinsic Semiconductor Corporation, now part of Cree, introduced 3-inch SiC wafers with no micropipes. 4-inch micropipe-free wafers are now on the market.

The first SiC semiconductor device to be commercially introduced was a Schottky diode, a fast rectifying device with one half made of semiconductor and the other of metal, introduced by Infineon of Germany in 2001. The SiC Schottky diodes were expensive but much more robust than silicon Schottky diodes. SiC Schottky diodes have now largely replaced silicon Schottky diodes in power applications. SiC Schottky diodes are now available that can handle as much as 1700 volts, about five times more than the limit for their silicon counterparts.

Infineon power Schottky diodes

Diodes are fine, but power electronics runs on transistors, and it's taken a little more time to construct them with silicon carbide. The first commercial SiC transistors were "junction FETs (JFETs)", made by Mississippi-based SemiSouth Laboratories and introduced in 2006. A range of different SiC transistor configurations are now being offered by Cree, Infineon, and other SiC fabricators.

SiC devices are seen as particularly important in power control systems for electric and hybrid cars, which demand highly efficient use of energy. Not only are SiC power control systems considerably more efficient, thanks to the high bandgap of SiC, they're much more heat-tolerant. Silicon devices will fail at about 150 degrees Celsius, but SiC devices can tolerate more than twice that temperature. That makes SiC particularly well suited for rugged-environment applications, such as military systems and electronics for oil wells, geothermal plants, and spacecraft -- though that also implies the development of support components, such as capacitors, that can also take the heat. As far as electric and hybrid cars go, SiC's high thermal tolerance translates into less need for liquid cooling to keep the temperature of power electronics modules down.

SiC devices are already proving useful in power converters for solar power arrays. Solar arrays usually need converter modules to turn their DC electricity into AC electricity that can mesh with the power grid. Silicon converter systems are already efficient, with no more than 3% losses, but SiC can easily cut those losses in half -- which may not sound like much, but for a large solar array that will operate for decades, the savings in power could easily add up to hundreds of thousands of dollars.
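
The dollar arithmetic is straightforward. In the sketch below, the array size, capacity factor, electricity price, and lifetime are assumptions picked for illustration; only the "3% losses, cut in half" figures come from the article:

BEGIN PYTHON:

# Rough value of halving converter losses for a hypothetical solar array.
# Array size, capacity factor, price, and lifetime are assumed numbers.
array_kw = 10_000          # 10 MW utility-scale array (assumption)
capacity_factor = 0.20     # average output as a fraction of nameplate (assumption)
price_per_kwh = 0.10       # USD (assumption)
years = 25                 # operating life (assumption)

annual_kwh = array_kw * capacity_factor * 8760
loss_silicon = 0.03        # ~3% conversion losses, per the article
loss_sic = 0.015           # losses roughly halved with SiC

saved_kwh = annual_kwh * (loss_silicon - loss_sic) * years
print(f"energy recovered over {years} years: {saved_kwh:,.0f} kWh")
print(f"value at ${price_per_kwh:.2f}/kWh: ${saved_kwh * price_per_kwh:,.0f}")

END PYTHON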

Ultimately the use of SiC across the board, from high-voltage power systems to household consumer gear, coupled to smart power control systems, will provide a clear jump in energy efficiency -- not enough to change the basic economic landscape for energy, but more than enough to pay for itself. That will require further growth in the capability of SiC devices along with reductions in cost, but those working in the business feel they have good reason to believe such things will happen.


[MON 24 OCT 11] FEEDING THE NINE BILLION (6)

* FEEDING THE NINE BILLION (6): As noted in the previous installment, genetic technology is proving a significant factor in animal husbandry. In fact, it's almost unarguably the most important factor in preparing 21st-century agriculture to feed the world in 2050. The key is "marker-assisted breeding (MAB)" -- that is, the ability to read the genetic "markers" for various traits in crop plants, then selectively breed them, possibly with some genetic alteration, to generate the desired traits.

It wasn't so long ago that we really only knew how crop plants behaved by watching them grow; now we are getting an ever-improving understanding of how they work. For example, plants have an ability to "remember" how long winter has been going on, so they don't mistake a mild spell in January for spring. There is actually a regulatory gene that controls the "memory", with the process finely tuned to local environments: the gene revives the plant a month later in Sweden than it does in southern England.

The cost of identifying genes has been dropping rapidly. Not too long ago, it cost about $2 USD to nail down one gene in a plant; right now it's about 15 cents, and current work should allow 30 genes to be spotted for a penny. Cost is not going to be a constraint for MAB in the near future. The constraint is, of course, the public resistance to genetically modified foods, with opposition to GM being particularly strong in European countries -- but that's not as big a constraint as it might seem. Crop plant populations have considerable genetic diversity on their own; just being able to select "natural" traits among that diversity for exploitation provides an enormous benefit without use of GM.

The genetic diversity of a population of crop plants is like a huge library. MAB not only provides a catalog for the library, it provides a table of contents and an index for each book, allowing traits to be easily tracked down, as well as sorted and cross-indexed. Agritech giant Monsanto has a "corn chipper" which takes a tiny sample of genetic material and quickly generates a genetic profile for that seed, with the profile going into a database with other seeds. The seed is not harmed by being "chipped" and can still be bred.
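
The "catalog and index" metaphor maps onto very ordinary data processing. Here's a toy sketch of the selection step -- the marker names, alleles, and profiles are all hypothetical, and a real program would track thousands of markers per seed rather than three:

BEGIN PYTHON:

# Toy marker-assisted selection: each chipped seed has a marker profile in a
# database; breeders keep the seeds whose profiles best match the target
# alleles. Everything here is hypothetical and only illustrates the idea.
TARGET = {"borer_resistance": "A", "drought_tolerance": "T", "kernel_size": "G"}

seed_db = {
    "seed_001": {"borer_resistance": "A", "drought_tolerance": "T", "kernel_size": "C"},
    "seed_002": {"borer_resistance": "A", "drought_tolerance": "T", "kernel_size": "G"},
    "seed_003": {"borer_resistance": "G", "drought_tolerance": "C", "kernel_size": "G"},
}

def score(profile):
    """Count how many of the target marker alleles a seed carries."""
    return sum(profile.get(marker) == allele for marker, allele in TARGET.items())

ranked = sorted(seed_db, key=lambda s: score(seed_db[s]), reverse=True)
for seed in ranked:
    print(f"{seed}: {score(seed_db[seed])} of {len(TARGET)} target markers")
# The top-scoring seeds go into the next breeding cycle -- no genetic
# modification involved, just faster, better-informed selection.

END PYTHON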

Thanks to MAB, Monsanto has been moving from strength to strength. In 1997, Monsanto introduced a new variety of corn that was resistant to various pests; it fully controlled 4 of 15 common "above-ground" pests like corn borers, cutworms, and stinkbugs, while partly dealing with three more. The 2004 successor variety of corn controlled 9 of the 15 above-ground pests and 7 of the 8 soil pests, while the 2010 variant partly dealt with three more soil pests. What is particularly encouraging about the potential of MAB is that it has been barely tapped. Only a small portion of the libraries of crop plant diversity is available at present, but genomic data is pouring in at an increasing rate. Up to this time, companies like Monsanto have come up with improved varieties by focusing on one or two traits, such as traits for disease resistance. In the future, they will be able to generate new varieties on the basis of sets of genes with complicated interactions.

That promises crop plants optimized for all desired traits. The ability to optimize over a matrix of genes has a bonus: resistance to disease or pests based on an interlocking set of traits means a more robust plant, because the assailants will have to evolve around multiple traits, not just one, to defeat plant defenses, staving off the exasperating evolutionary "Red Queen's Race". However, the most important single factor is yield, which will have to increase by at least 1.5% a year over the next few decades to ensure that the world of 2050 can be fed.
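
A quick check on what that growth rate implies, taking 2011 as the baseline year (my assumption for the arithmetic, not a figure from the source):

BEGIN PYTHON:

# What "at least 1.5% a year" compounds to by 2050, starting from 2011.
rate = 0.015
years = 2050 - 2011
factor = (1.0 + rate) ** years
print(f"{years} years at {rate:.1%} per year -> yields up by a factor of {factor:.2f}")
# A bit under 1.8x over the period.

END PYTHON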

So far, corn is one of the few crop plants on that track. Other crops such as wheat, rice, and soybeans need to be brought up to speed. Wheat has been a notable laggard, partly because it is genetically complicated, a "polyploid" hybrid of three progenitor grasses, giving it six sets of chromosomes instead of the two sets that we possess. Work on wheat has been picking up, with new strains in the works.

While GM isn't the whole story for greatly improved lines of crop plants, it still could play an enormous role, if political obstacles can be overcome. One particularly interesting line of research has been towards genetic modification of wheat using genes from "leguminous" plants, such as peas and beans, to allow wheat to make its own fertilizer. Legumes feature nodules on their roots that provide homes to "nitrogen-fixing bacteria" that extract nitrogen from the air and convert it into ammonia. Giving wheat the same ability would mean largely self-fertilizing crops. Researchers think it may take decades to accomplish, but the idea of self-fertilizing wheat and other crop plants is incredibly attractive.

The problem is, again, the fear of GM foods. The huge potential of GM and the demands of feeding a growing world population give GM an irresistible momentum, making a rigid ban as impossible over the long run as holding back the ocean. What is actually needed is a well-defined regulatory environment, where GM plants can be introduced after proper qualifications.

* As a footnote, what role will agritech such as automated indoor farming play in the future of food production? That's hard to say. There's nothing particularly new or unconventional about indoor farming using hydroponics or aeroponics, but the extent to which it supports food production now or could in the future is unclear. It tends to be capital-intensive, though it's also less subject to weather conditions, with little need for pesticides and much greater ability to recycle nutrients.

Notions of "skyscraper farms" are a little hard to buy, since they tend to be energy-intensive -- but other schemes, such as an upper-storey greenhouse on top of a supermarket to generate fresh produce, seem intriguing. Such notions may not be a major factor in solving world hunger, but they may represent a good business proposition that brings in a profit and gives consumers what they want. Much the same could be said for other notions such as aquaculture, another established practice growing in influence, and more unorthodox ideas such as raising insects in bulk for protein meal in animal feed. There's a place for experimentation, and an idea doesn't have to save the world to fly -- it just needs to make business sense. [TO BE CONTINUED]


[FRI 21 OCT 11] THE KILLING OF JFK -- THE BALANCE OF EVIDENCE (36)

* THE KILLING OF JFK -- THE BALANCE OF EVIDENCE (36): Along with the picky matter of the cartridge cases found at the scene of the Tippit shooting, there's the issue of a wallet found there. In 1996, ex-FBI Agent James Hosty claimed that Dallas police Captain W.R. Westbrook had found a wallet near the puddle of blood where Tippit had been killed, and determined it was Oswald's wallet. Hosty learned this because Westbrook showed it to FBI Agent Robert Barrett, who told Hosty about it in 1995. That didn't square with the official story, that the Dallas police took Oswald's wallet from him at the Texas Theater.

Dale Myers interviewed Barrett in 1996; Myers couldn't talk to Westbrook, since he had died earlier that year. Barrett said that what had specifically happened was that Westbrook was holding a wallet, and that Westbrook asked him if he knew who Lee Harvey Oswald or Alek Hidell were -- so Westbrook simply assumed that the wallet must have been Oswald's. Silent video footage taken by local TV station WFAA-TV did show police inspecting a wallet at the crime scene. However, none of the police who showed up in the immediate aftermath of Tippit's killing noticed the wallet, and if it had actually been Oswald's, the police would have found it extremely interesting. It was likely Tippit's wallet, but since all there was to go on in the matter was Barrett's belated testimony and the ambiguous video footage, the wallet amounts to nothing more than a thin straw that's disappeared into the wind.

* One of the other discrepancies of the Tippit shooting was the fact that the initial police report said the suspect was carrying an automatic pistol, not a revolver. That was because the police had found the empty cartridge cases. Cases are ejected from an automatic pistol after being fired, but they have to be manually removed from a revolver, and experience suggested to the police that it was very unlikely a shooter with a revolver would take time to reload while fleeing the scene of a crime. Conspiracy theorists point out that cartridge cases for a 0.38 automatic are stamped "38 AUTO" and that the police should have known the difference -- but not only is it plausible that the police weren't aware of that detail, the fact that the recovered cases carried no such stamp while the police still reported an automatic suggests they had simply misinterpreted matters.

Conspiracy theorists also play up the fact that Ted Callaway said the shooter was carrying an automatic pistol, though it's unclear where he said so: he mentioned nothing about the configuration of the pistol in his deposition on 22 November, in his Warren Commission testimony, or in a 1964 TV interview for CBS News. It appears to have been a casual remark to the police on 22 November 1963. Callaway did admit to having said it in a 1996 interview with Dale Myers, and explained why:

BEGIN QUOTE:

When I saw [Oswald] ... he had his pistol in a raised position [in his right hand] and his left hand going to the pistol ... And [when I was in the Marines and handling a Colt 45 automatic] I used that same motion before in pushing a loaded magazine up to the handle of a 45. ... And so, when they asked me what kind of gun that he had I told them it was an automatic; on account of that motion.

END QUOTE

That was all there was to it: Oswald slapped the grip of the pistol with his free hand, possibly to emphasize he was armed and dangerous, and Callaway thought Oswald was shoving a magazine into the pistol, something not done with a revolver. Callaway obviously didn't think the matter was significant, given that he didn't mention it in his various recorded formal testimonies on the matter. The facts suggest a simple case of mistaken weapons identity, and the "0.38 automatic" tale ends up in the same category as the "Carcano was a Mauser" story. [TO BE CONTINUED]


[THU 20 OCT 11] SPACE NEWS

* Space launches for September included:

-- 10 SEP 11 / GRAIL -- A Delta 2 7420 Heavy booster was launched from Cape Canaveral to put the dual-spacecraft "Gravity Recovery & Interior Laboratory (GRAIL)" mission into space. GRAIL was to obtain a high-resolution gravity map of the Moon.

GRAIL was derived from an Earth satellite system, the "Gravity Recovery And Climate Experiment (GRACE)", launched in 2002. GRACE consisted of two satellites, one following the other in the same orbit, connected by a precision microwave link; measurements of the variations in the flight path of the two spacecraft provided data for construction of an extremely precise gravitational map of the Earth, mostly to track the shifting of the Earth's oceans.

GRAIL

Like GRACE, the two GRAIL spacecraft flew in the same orbit in tandem, remaining in contact over a precision microwave measurement link, the "Lunar Gravity Ranging System (LGRS)", operating in the Ka band and able to measure the distance between the two spacecraft to about the width of a red blood cell. Obtaining reliable measurements meant that the spacecraft had to be tracked closely from Earth, and so both were fitted with a radio beacon to ensure precise tracking.

Each GRAIL satellite had a launch mass of 307 kilograms (677 pounds) and was about the size of a household clothes washer. They were loosely based on the "Experimental Satellite System 11 (XSS-11)", built by Lockheed Martin for the US Air Force Research Laboratory and launched in 2005. The GRAIL spacecraft had twin solar arrays for power and a hydrazine rocket / thruster maneuvering system; they featured thermal protection systems to prevent solar heating from throwing off their measurements. The two looked identical, but they were "handed", being "mirror images" of each other.

The two GRAIL orbiters were launched on a circuitous flight path that took them around the Earth-Sun L1 Lagrange point, resulting in a trip to the Moon that took about three and a half months. The roundabout trajectory was to get the two spacecraft into lunar orbit with a minimum of fuel, reducing launch mass so a smaller and cheaper booster could be used. The extended trip to the Moon also allowed the spacecraft to shed residual traces of gases that could have affected their measurements. The GRAIL spacecraft were placed in orbit at an altitude of 51.5 kilometers (34 miles), with separation varying from 100 kilometers (62 miles) to 225 kilometers (140 miles). Wider separation permitted mapping more deeply into the Moon. Mapping the Moon took about three months; the gravity map was of better resolution than that provided by the GRACE satellites, since the GRAIL spacecraft could orbit the Moon at lower altitudes.

The only secondary payload was a "MoonKAM", a four-camera fixed imager array for outreach to middle school students. Students were able to request that MoonKAM take images of particular lunar features; the outreach program was run by ex-astronaut Sally Ride.

-- 18 SEP 11 / CHINASAT 1A -- A Long March 3BE booster was launched from Xichang to put the "Chinasat 1A" geostationary comsat into orbit. ChinaSat 1A was probably based on the DFH-4 comsat bus and had a launch mass of 5,215 kilograms (11,500 pounds). Not much was announced about it, suggesting that it was a military comsat.

-- 20 SEP 11 / GARPUN 1 (COSMOS 2473) -- A Proton M Briz M booster was launched from Baikonur in Kazakhstan to put a secret Russian military satellite into orbit. It was believed to be the first "Garpun (Harpoon)" communications satellite, intended to relay data from Russian surveillance satellites, replacing the older "Geizer" satellites. The spacecraft was designated "Cosmos 2473".

--22 SEP 11 / ARABSAT 5C, SES 2 -- An Ariane 5 ECA booster was launched from Kourou in French Guiana to put the "ArabSat 5C" and SES ASTRA "SES 2" geostationary comsats into orbit. Arabsat 5C was built by Astrium of France and was based on the Eurostar 3000 satellite platform, with Thales Alenia Space providing the satellite's payload. The spacecraft had a launch mass of 4,618 kilograms (10,183 pounds) with the payload including four antenna reflectors plus 26 C-band / wideband transponders, the wideband transponders operating in ten sub-bands across the Ka band. Arabsat 5C had a design life of 15 years; it was placed in the geostationary slot at 20 degrees East longitude to provide communications services to the Middle East and Africa.

SES 2 was built by Orbital Sciences Corporation and was based on the Orbital GEOStar 2.4 satellite bus, the company's biggest. It had a launch mass of 3,152 kilograms (6,950 pounds), and carried a payload of 24 C-band / 24 Ku-band transponders. It was placed in the geostationary slot at 87 degrees West longitude to provide communications services to customers in North America. SES 2 was one of three almost identical satellites built by Orbital for SES and launched in 2010 and 2011. SES 2, however, also carried the US Air Force's "Commercially Hosted Infrared Payload (CHIRP)" infrared surveillance payload. CHIRP featured a wide field-of-view staring sensor to pick up infrared signatures of missile launches, as well as explosions and fires; it was built by Science Applications International Corporation (SAIC) of Virginia.

--24 SEP 11 / IGS 6A -- A JAXA H-2A booster was launched from Tanegashima to put the "Information Gathering Satellite (IGS) 6A" electro-optical spy satellite into orbit.

--24 SEP 11 / ATLANTIC BIRD 7 -- A Sea Launch Zenit 3SL booster was launched from the Sea Launch Odyssey platform to put the "Atlantic Bird 7" geostationary comsat into orbit. The satellite was built by EADS Astrium and was based on the company's Eurostar 3000 bus. It had a launch mass of 4,600 kilograms (10,150 pounds), carried a payload of 44 Ku-band transponders, and had a design life of 15 years. It was placed in the geostationary slot at 7 degrees West longitude to provide communications services to the Middle East and Africa.

Tacsat 4

--28 SEP 11 / TACSAT 4 -- A Minotaur 4 booster was launched from Kodiak Island in Alaska to put the "TacSat 4" experimental military comsat into orbit. The 450 kilogram (1,000 pound) satellite was placed in a highly elliptical orbit to provide communications for tactical warfighters. TacSat 4 was designed by the Pentagon's Operationally Responsive Space (ORS) organization and implemented by the Naval Research Laboratory. The satellite carried a payload of 12 UHF transponders, communicating using a 3.65 meter (12 foot) antenna. The elliptical orbit permitted warfighters a line of sight to the satellite even when they were down in deep valleys; four TacSats could provide continuous coverage to a specific battle theater.

--29 SEP 11 / TIANGONG 1 -- A Long March 2F booster was launched from Jiuquan to put the "Tiangong 1" mini space station into orbit. The spacecraft had a launch mass of 8,615 kilograms (19,000 pounds), a length of 10.4 meters (34 feet 1 inch) and a maximum diameter of 3.35 meters (11 feet). The Tiangong ("Heavenly Palace") station featured two modules: a pressurized experimental module where visiting crews would live and work, and a resource module providing solar power and propulsion.


Tiangong 1

Tiangong 1 was intended mostly for technology validation and had a design lifetime of two years. The booster was the "2F T1" Long March subvariant, with larger liquid-fuel strapons and a bigger payload shroud than a standard 2F booster. The unmanned "Shenzhou 8" capsule was to follow the station into orbit about a month later to perform automated docking tests, with crewed flights to follow.

--30 SEP 11 / QUETZSAT 1 -- A Proton M Breeze M booster was launched from Baikonur in Kazakhstan to put the SES "QuetzSat 1" direct-broadcast TV geostationary comsat into orbit. QuetzSat 1 was built by Space Systems / Loral and was based on the SS/L 1300 satellite platform. The satellite had a launch mass of 5,600 kilograms (12,350 pounds), a payload of 32 Ku-band transponders, and a design life of 15 years. The spacecraft was placed in the geostationary slot at 77 degrees West longitude to provide direct-to-home communications services to Mexico.

* OTHER SPACE NEWS: The NASA Upper Atmosphere Research Satellite (UARS) was put into orbit by shuttle Discovery in September 1991; after a few days more than 20 years in space, the bus-sized UARS finally fell back to Earth on 23 September. There were concerns because the reentry was uncontrolled and there was no telling where the 5,900 kilogram (13,000 pound) spacecraft might come down, but the likelihood of much of it surviving reentry and hitting anyone on the ground was small. It fell into the Pacific south of the Equator, northeast of Samoa.

There was a bit of fuss about the matter, but not as much as that associated with the fall of the much larger NASA Skylab space station in July 1979. Nobody was hurt by Skylab's crash either, but a chunk of it did fall in Australia, where local authorities fined NASA for littering. NASA estimated the chances of anyone being injured by the fall of UARS as 1 in 3,200 -- which some mathematically naive folk suggested was a lot better odds than winning the lottery. No: the 1 in 3,200 was the chance of the debris hitting anyone, anywhere on Earth; the chance of it hitting any one particular person was that figure divided by the world's population of roughly seven billion. The odds of somebody winning the lottery are 1, somebody's going to win; the odds of any one specific ticket holder winning are very small -- but still huge compared to the probability of any given person being hit by a chunk of space debris.
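
For the record, the arithmetic looks like this; the world population and the lottery odds (roughly big-jackpot territory) are my assumed round numbers, with only the 1-in-3,200 figure coming from NASA:

BEGIN PYTHON:

# Comparing NASA's UARS risk estimate with lottery odds. The 1-in-3,200
# figure is the chance of the debris hitting *anyone*; dividing by an
# assumed world population of 7 billion gives the risk to one person.
p_anyone_hit = 1.0 / 3200
world_population = 7.0e9
p_you_hit = p_anyone_hit / world_population

p_ticket_wins = 1.0 / 175_000_000      # assumed odds for a single big-jackpot ticket

print(f"chance any given person is hit: about 1 in {1 / p_you_hit:,.0f}")
print(f"a single lottery ticket is about {p_ticket_wins / p_you_hit:,.0f} times more likely to win")

END PYTHON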

Just to compound matters, later in the month the German DLR space agency announced that their ROSAT (Roentgen Satellite) space observatory was due to perform another uncontrolled reentry. Although ROSAT, at 2,400 kilograms (5,300 pounds), was much smaller than UARS, its subassemblies were more robust and up to three times more debris from ROSAT was expected to hit the ground than from UARS. The DLR accordingly gave higher odds of someone getting hit: 1 in 2,000. It reentered over the Bay of Bengal on 23 October, causing no damage.


[WED 19 OCT 11] NO RISING SEAS?

* NO RISING SEAS? The case for climate change is heavily based on observations that are difficult to assess, and since climate is a noisy phenomenon, it's not always easy to determine what the observations actually reveal. Consider the issue of sea-level rise. Given a warming climate, the oceans should clearly be rising, not just because of the melting of ice sheets but because warmer water expands slightly.

For the past 18 years, the US/French Jason-1, Jason-2 and Topex/Poseidon satellites have been monitoring the gradual rise of the world's ocean in response to global warming. The rise has been remarkably steady during this period, but this last year, global sea levels have fallen about half a centimeter. The backtracking may be slight, but it's still there: does it mean that climate change is an illusion?

rising seas?

NASA climate scientists say NO, identifying the cause of the decline as the Pacific El Nino and La Nina oceanic cycles. Although 2010 began with a substantial El Nino, by year's end it was replaced by one of the strongest La Ninas in recent memory. This sudden shift in the Pacific changed rainfall patterns across the world, bringing massive floods to regions such as Australia and the Amazon basin, while afflicting the southern USA with drought.

Gravity maps from the NASA/German Aerospace Center's twin Gravity Recovery and Climate Experiment (GRACE) satellites provide a clear picture of how this extra rain fell onto the continents in the early parts of 2011. Of course, the extra water that fell on land had to mostly come from the oceans; the increment of rainfall was actually enough to make the seas fall slightly. Few think the decline will last very long, the long-term trendline in sea levels being persistently upward.
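
The half-centimeter figure corresponds to an impressive amount of water. As a rough order-of-magnitude check -- using the commonly quoted ocean surface area of about 361 million square kilometers, with the 0.5 centimeter drop taken from the satellite record:

BEGIN PYTHON:

# Order-of-magnitude check: how much water a 0.5 cm global sea-level drop
# represents. Ocean surface area is the commonly quoted ~361 million km^2.
ocean_area_km2 = 3.61e8
drop_km = 0.5 / 100 / 1000      # 0.5 cm expressed in kilometers

volume_km3 = ocean_area_km2 * drop_km
print(f"roughly {volume_km3:,.0f} cubic kilometers of water temporarily parked on land")

END PYTHON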

* In related climate news, it is known that mid-latitude storm tracks account for the bulk of precipitation in the Earth's middle latitudes, which includes most of the heavily populated areas of North America, Eurasia, and Australia. Atmospheric dynamics mean that these storm tracks tend to recur in the same locations. Some climate models have predicted that the positions of these storm tracks would slowly migrate toward the poles, but so far this trend has not been observed. However, analysis of 25 years worth of data from the "International Satellite Cloud Climatology Project (ISCCP)" now indicates that this shift is probably now taking place.

The ISCCP operates a network of Earth observation satellites that have been collecting data on clouds since 1983. A team of researchers carefully analyzed data for Northern and Southern Hemisphere storm tracks in the Atlantic and Pacific Oceans to look for trends; the Indian Ocean couldn't be included because of issues with satellite coverage. The results indicated a slight poleward shift of the storm tracks. The satellites employed by the ISCCP do have known data issues -- measurement changes when new satellites came online, lower data quality at the "seams" between coverage from different satellites, and so on -- so the authors used several different analysis techniques to see if they got different results. All gave a poleward trend, though they varied in rates.

Where the study gets more interesting is that satellite data also shows a roughly 2% to 3% reduction in global cloud cover since 1983. This occurred through a large decrease in low-level cloudiness, and it came despite a slight increase in high-level cloudiness. The action of clouds is one of the most heavily argued elements in climate models, but it is generally believed that high-level cirrus clouds provide an enhanced "greenhouse effect", with thin cirrus clouds allowing sunlight to reach the ground while trapping the infrared thermal radiation re-emitted from below; thicker low-altitude clouds, in contrast, tend to reflect sunlight back into space. Increasing high-altitude cloudiness and decreasing low-altitude cloudiness should then skew towards warming.

The study was cautious about drawing sweeping conclusions, in particular pointing out that the most significant data involved was taken from the limits of the satellite observations, meaning there's uncertainty about the levels of signal versus noise. What is unambiguous is that more and better satellite data is essential to understand climate change.


[TUE 18 OCT 11] E-BOOKS ON A ROLL

* E-BOOKS ON A ROLL: It isn't news that Amazon.com's Kindle e-book reader has been a big hit; what is news, as reported by THE ECONOMIST ("Great Digital Expectations", 10 September 2011), is that e-book technology is, after a long gestation, proving transformational. E-book sales are creeping ahead of print sales, with Amazon now selling more e-books than print books. The trend is likely to continue, increasingly pressuring bookshops, the recent liquidation of the American Borders chain of bookstores being dramatic evidence of the shift.

Books are arriving in the digital world in the tracks of music and video, industries which suffered considerable pain in the transition. Book publishers believe their prospects are better, but digital media still presents challenges. E-books certainly have advantages for publishers, since production costs are low, and they effectively eliminate inventory problems: books are always in print, always in stock, and there's no glut of unsold books. Crime blockbusters and romance novels are doing particularly well in digital media, possibly in part because nobody feels embarrassed to be reading a trashy bodice-ripper on a Kindle while riding a subway. Romance novels have short shelf lives and rapid turnover of titles, traditionally making distribution a big headache that is greatly reduced by digital distribution. Harlequin, a major publisher of romances, has 13,000 digital titles and is now even introducing digital-only works.

Unfortunately, the e-book business is just as vulnerable as music and video in significant ways, the most visible problem being of course piracy. Readers tend to be more inclined than not to share e-books with friends, a process which can start out as harmless as loaning out a book and end up as undisguised piracy. E-book piracy is already out of control in Russia.

There is, however, a bigger problem: chaos in e-book pricing. When Amazon began pushing e-books, it charged only $9.99 USD for many titles, sometimes selling at a loss to push Kindle sales. It gradually became obvious that such low prices were undermining the perceived value of all books, customers beginning to see anything more expensive as overpriced. With the introduction of the Apple iPad in 2010, book publishers were able to prevail on Amazon to allow the publishers to set prices, with Amazon taking a 30% cut.

However, the new policy hasn't actually worked out well. Instead of enforcing rationality, the new pricing policy collided with a rush by authors without publishers to put their product on the e-book market at rock-bottom prices, as well as publishers trying to get an edge by selling at steep introductory discounts. Now the Amazon best-seller list is crowded with cheap titles, with prices as low as 99 cents. There's something inspiring about such product democracy, but it's much more a benefit to consumers than producers. Publishers say that they've always charged a range of prices for books -- hardbacks at the top of the line, high-quality paperbacks in the middle, and small mass-market paperbacks at the bottom. The problem is that there is no such product differentiation for e-books; there's no sensible way to make one edition of an e-book appear nicer than another, except possibly by adding extras as with DVDs, and there's no big push to do that yet.

There's a third problem, in the form of Amazon's market dominance. Amazon controls about three-quarters of the e-book market in the USA, and a whopping 90% of the market in the UK. Barnes & Noble is giving Amazon some competition in the USA with its Nook reader, but Apple's iBookstore for the iPad appears to be a nonstarter, with few iPad owners finding it handy as an e-book reader; most of them want to buy a Nook or Kindle for that purpose. As described here earlier, the Nook is basically a cheap tablet, and once Amazon emulates it, it may take more market share away from Apple than the reverse, as the e-book readers get more apps and start looking more like general-purpose handy computers. Leaving no stone unturned, Amazon is also getting into the publishing end of the business, starting up a romance label and commissioning works from popular authors.

The final problem with e-books is: what happens to bookstores? The bookish among us have always liked browsing the book racks, and though Amazon's elaborate recommendation system is impressive, it doesn't really replace the experience of picking up a book and paging through it. Efforts are underway to create social networking sites for book-lovers that may provide a better experience. Whatever happens, the centuries-old kingdom of the book is now facing rapid and dramatic change.

Kindle Fire

* ED: As anyone who follows tech news is aware, Amazon has now dropped "the other shoe" and announced its "Kindle Fire" tablet e-book reader, to be introduced in mid-November for $199 USD. It has a multitouch color 7-inch display with a resolution of 1024 x 600 pixels, a dual-core processor (almost certainly ARM), 8 gigabytes of flash storage, a headphone jack, plus wi-fi and USB connectivity. There's no mention of an SD card slot, and it would be nice to have a mike input for taking audio notes. Some critics have complained about the 8 GB capacity, saying it's not enough to store adequate amounts of video, but Amazon is promoting its cloud computing network as part of the sales pitch, saying there's no need to keep so much material resident on the tablet when it's conveniently available from the cloud.

The late Steve Jobs mocked 7-inch tablets, saying they were too small to be useful. From my own point of view, they're just the right size, handy and easily lugged around in a kit bag. They may actually fit a slightly different niche than that occupied by the iPad, but Amazon's effectively covered that base by pushing Fire as an e-book reader. Still, not being into e-books yet, I'm not inclined to bite for the time being.

COMMENT ON ARTICLE
BACK_TO_TOP

[MON 17 OCT 11] FEEDING THE NINE BILLION (5)

* FEEDING THE NINE BILLION (5): As discussed in previous installments, food production is dependent on a number of factors -- the supply of some of which, such as land, water, and fertilizer, is inflexible. So how can the world feed 9 billion people? Improved management of resources is of course important to that goal, but the belief is that three measures will be the most significant -- the first of them being to close the yield gap between the world's best and worst producers.

That gap is a big one, and it's not just a case of impoverished African farming bringing down the average. Farms in Western Europe and Eastern Europe have similar climates for growing wheat, but farms in the West are two to four times more productive than farms in the East. To be sure, there are differences in infrastructure between the most and least efficient producers, but the difference isn't necessarily overwhelming. Ghana, for example, has plenty of land well-suited to growing crops, but yields are low simply because there's not much use of high-yield hybrid seed there. Almost all of Brazil's corn crop sprouts from hybrid seed these days; the country is now the world's third-biggest corn exporter. If Ghanaian farmers started planting hybrid seed, their productivity would boom as well.

African farmers

Africa has traditionally been stuck in a vicious cycle, with poor farmers unable to earn enough money to improve their productivity, keeping them poor. Africa also traditionally had the problem that crops that are staples there, like sorghum and cassava, were secondary crops at best elsewhere, and so research into improved varieties of such plants was secondary as well. Thanks to globally higher prices, however, African farmers are doing better for themselves and starting to get ahead. There's also been an enhanced research focus on "African crops", with improved varieties of sorghum and cassava now available.

Of course, improved seed doesn't do African farmers any good if they can't get their hands on it. There are people working on that issue, one of the important players being the "Alliance for a Green Revolution in Africa (AGRA)", which has set up 45 companies to distribute improved seed; AGRA believes that a hundred such companies will be enough to provide a continent-wide distribution system. Things are already improving, with AGRA officials saying that though only 10% of Kenyan farmers are using improved seed now, half will be by 2015. Even with improved seed, Africa still has tremendous issues with infrastructure, transport being a particularly weak link -- but in spite of that, since 2008 African per-capita food production has been increasing for the first time in decades. Some predict Africa will be a net food exporter within a generation.

* The "livestock revolution" mentioned above simply means a switch from traditional methods of animal husbandry -- letting chickens and pigs scratch around for a living, feeding them with scraps or whatever -- to modern factory processes, with caged animals run through a manufacturing system in which their diet and health are carefully controlled. It's not pretty, and in Europe there has been a public backlash against such "battery" systems, but it's efficient. That's important because meat is becoming an increasing proportion of the global diet, with the FAO estimating that from now to 2050 the proportion of the world's calories provided by meat will rise from 7% to 9%, and the share of eggs and dairy products rising more.

The efficiency is all the more important because meat production, as mentioned previously, has a bigger "footprint" than crop production in terms of each kilogram of output. Huge populations of livestock also present a public health challenge: many diseases that afflict humans were originally acquired from livestock, which continue to provide a conduit for pathogens to infect humans. The matter is serious because livestock production is growing by leaps and bounds, at a rate substantially faster than that of cereals. World meat production more than doubled between 1980 and 2007, as did production of eggs and dairy products. India has become the world's biggest dairy producer, with output tripling; Brazil's production of chickens has increased by over a factor of five, with the country now the world's biggest exporter. In roughly the same timeframe, China has increased production of milk and eggs by a staggering tenfold.

The efficiency of battery systems is indisputable. A free-ranging hen might lay 50 to 100 eggs a year and costs nothing to feed, but a factory hen can lay 300 a year, with the cost of feeding her covered by about 150 of those eggs. Selective breeding of factory hens has also made them more efficient, cutting in half the amount of feed needed to produce a given quantity of eggs. Besides, there's no way to get real volume production using traditional animal husbandry -- and high-volume production also gives bigger populations of animals from which to select the most productive and most disease-resistant individuals for improving the next generation.
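
Just to put rough numbers on that claim, here's a back-of-envelope sketch in Python; the egg and feed figures are the ones cited above, everything else is purely illustrative:

    # Back-of-envelope comparison of free-range versus battery egg production,
    # using the rough figures cited above; all values are illustrative estimates.
    free_range_eggs = 75      # roughly 50 to 100 eggs a year, feed effectively free
    factory_eggs = 300        # eggs a year from a battery hen
    feed_cost_in_eggs = 150   # feeding a battery hen costs the equivalent of ~150 eggs

    factory_net = factory_eggs - feed_cost_in_eggs
    print(f"free-range net yield: ~{free_range_eggs} eggs/year")
    print(f"battery net yield:    ~{factory_net} eggs/year")
    print(f"advantage:            ~{factory_net / free_range_eggs:.1f}x")
    # => the battery hen nets about 150 eggs a year, roughly double the free-range midpoint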

factory hens

There seems to be plenty of room for growth in livestock production. Brazil, a global agricultural powerhouse in most respects, still hasn't got completely on board with battery systems, with some agricultural experts saying the country could easily double the productivity of its beef herds. To be sure, battery systems require a fair investment of capital, but researchers believe that small farmholders who can't build factories for their chickens and cattle can still benefit, by using practices such as controlled feeding, and in particular by making use of the improved breeds of livestock generated for the big factory operations.

Livestock breeding has become a particularly impressive source of greater productivity. It's been steadily improving for decades, thanks to introduction of techniques such as artificial insemination, but new genetic analysis tools promise to accelerate progress in the field. As researchers learn to read the genetic "book" of livestock, they are increasingly able to identify traits for selective breeding without having to resort to trial and error methods.

The problem with battery systems, again, is their unfortunate resemblance to extermination camps for animals. Anyone willing to eat meat can hardly object to the slaughter of animals. However, even accepting battery systems, a case can be made for minimizing the cruelty of the system, and regulations have been gradually accumulating towards that end. Given the rising demand for meat, battery systems are likely here to stay -- but like it or not, those who run them will have to make concessions to public sensitivities. [TO BE CONTINUED]

START | PREV | NEXT | COMMENT ON ARTICLE
BACK_TO_TOP

[FRI 14 OCT 11] THE KILLING OF JFK -- THE BALANCE OF EVIDENCE (35)

* THE KILLING OF JFK -- THE BALANCE OF EVIDENCE (35): Was Oswald's 0.38 caliber Smith & Wesson revolver the weapon used to shoot Officer Tippit? One problem with the identification was that the weapon had originally been built to fire standard 0.38 caliber ammunition, but had been modified to fire "38 Special" ammunition -- which uses a slightly narrower 0.357 caliber bullet. That meant the undersized bullets didn't fly straight down the barrel, instead "rattling" from side to side as they exited. The result was inconsistent markings on the bullets, confounding standard forensic ballistic analysis. Both the Warren Commission and the HSCA had ballistic inspections performed on the bullets; the bottom line was that the examiners couldn't absolutely match the bullets to the pistol, but the bullets could have been fired from that specific pistol.

Four empty cartridge cases were recovered from the crime scene. Both the Warren Commission and the HSCA had them examined to see if they were compatible with the pistol as determined by firing pin and other marks on the cases. The unanimous conclusion was that the markings were consistent with having been fired from the pistol, to the exclusion of all other weapons.

Now it gets a little confusing. When Oswald was arrested, his revolver was fully loaded; it had been fired at some point -- that is, it was dirty -- but there was no test that could determine whether it had been fired recently or not. He had five 38 Special rounds in his pockets, all of which had been made by Western-Winchester; there were six rounds in the cylinder of the revolver, half of which had been made by Western-Winchester and the other half by Remington-Peters. Of course, there's nothing strange about the mix of ammunition in the revolver: obviously Oswald could have bought two different brands of ammunition, just as someone might have different brands of batteries in the same electronic gadget. The tricky part is that, of the four bullets removed from Tippit's body, three were Western-Winchester bullets and one was Remington-Peters; but of the four cartridge cases recovered from the scene of the shooting, two were Western-Winchester and two were Remington-Peters.

As Vincent Bugliosi pointed out, every case will have a few features that don't add up easily. That's because some of the evidence may be bogus or at least misleading; or there may be some missing piece of evidence that allows it to add up. Evidence for a sinister conspiracy? For the suspicious, maybe, but the simpler suggestion is that Oswald fired five times, missed once, and one of the empty cartridge cases was never recovered. Helen Markham said she heard three shots fired, Jack Ray Tatum said "three or four", as did William Scoggins -- Scoggins noting that the shots were "fast", which would have made it easier to confuse their number. Ted Callaway told the Warren Commission in 1964 that he heard five shots, and some of the other witnesses at the car lots said they heard six; Oswald may have emptied the revolver, which would have given him a reason to reload it on the spot. It should be remembered that the Dealey Plaza witnesses varied over a range on the number of shots they heard as well.

Again, the discrepancy in the type of bullets is a puzzle, but it is hard to see where it goes. The bottom line is that the cartridge cases were matched to Oswald's revolver. It also seems like more than mere coincidence that Tippit was hit by bullets made by two different manufacturers, and that Oswald's revolver was loaded with cartridges from the same two manufacturers. There are at least a half-dozen manufacturers of 38 Special ammunition these days, and there were no doubt plenty of firms making it in the early 1960s; given a half-dozen available brands of ammunition, the odds of ending up with the same two brands by chance would be 1 in 15 -- not tiny, but at least low enough to suggest it wasn't coincidence.
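
The 1-in-15 figure is just the number of ways of picking two brands out of six; a quick check in Python, with the assumption of six equally available brands made purely for illustration:

    # Odds that a randomly chosen pair of ammunition brands matches one specific
    # pair, assuming six equally available brands (an illustrative assumption).
    from math import comb

    brands = 6
    pairs = comb(brands, 2)   # number of distinct two-brand combinations = 15
    print(f"{pairs} possible pairs, so the odds of any particular pair are 1 in {pairs}")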

Conspiracy theorists also make a fuss about the chain of custody of the cartridges, but there's nothing unusual about that: conspiracy theorists are quick to suggest any inconvenient evidence was faked, and their standards for chain of custody are so high that they would render any prosecution impossible. The scrupulous concern over the matter immediately evaporates for evidence convenient to the conspiracy case, however. [TO BE CONTINUED]

START | PREV | NEXT | COMMENT ON ARTICLE
BACK_TO_TOP

[THU 13 OCT 11] GIMMICKS & GADGETS

* GIMMICKS & GADGETS: Medical researchers at the Technical University in Munich have developed an implantable sensor that can monitor cancer tumor growth, reporting status via a wireless link. The sensor tracks oxygen levels in surrounding tissue to check if the tumor is expanding. According to the project manager, Sven Becker: "There are some tumors which are hard to remove -- for example, close to the spine. You run the risk of cutting the nerve if you remove them surgically. Or the problem may be that the tumor is growing slowly, but the patient is elderly. In these cases it's better to monitor the tumor, and only treat it if there's a strong growth phase. Normally you would have to go to the hospital to be monitored -- using machines like MRI to detect the oxygen saturation. With our system you can do it on the go."

The team plans to add a medication pump to the chip that can release chemotherapeutic drugs close to a tumor if treatment is needed, providing a more selective "smart bomb" attack on the tumor than through "carpet bombing" the patient with general chemotherapy. The research group believes that a production device might be on the market in a decade.

* There are several other projects underway to develop implantable or stick-on patient monitoring schemes. One of the many pains of being a diabetic is having to take a blood sample on a regular schedule to determine blood glucose level. Now a team under Dr. Heather Clark of Northeastern University in Boston, Massachusetts, is investigating a better idea: "digital tattoos" that will allow diabetics to monitor their blood glucose levels with a cellphone. The same technology could be used to monitor other aspects of blood chemistry, for example alcohol levels.

Here's how the scheme works: a microscopic set of sensors is implanted under the skin, with the sensor array kept together by an oily agent. The elements of the sensor array include nanoparticles that bind to specific chemical targets -- glucose, alcohol, sodium. Binding to a target causes a particular nanoparticle to fluoresce. The array can be "read out" using a cellphone with an attachment to stimulate the fluorescent nanoparticles and read back the results. The technology is a long way from being fielded, but it offers the possibility that in the not too distant future, we could be implanted with medical sensors that would help keep continuous track of our health.

Researchers at the University of Illinois are investigating a "stick-on" tattoo, a flexible circuit mounted on a small sheet of plastic that can be applied simply by wetting the skin and slapping the circuit on. The tattoo obtains power from ambient radiation and apparently communicates via a wireless link. It easily stays on for a day, though after a week or two it sloughs off and has to be replaced. If the circuit is cheap enough to produce, using one would be little different from applying and wearing a band-aid as needed.

Another group of researchers, at the University of Sheffield in the UK, has come up with a related idea, a wound dressing that glows if an infection occurs. The technology is based on a gel containing molecules that bind to bacteria and activate a fluorescent dye, with the dressing glowing pink under ultraviolet light when dangerous levels of bacteria are present. The gel also works as an antiseptic. The work has been partly funded by the UK Ministry of Defence for battlefield medicine, though of course it has civilian applications as well. The research team believes clinical trials are two to three years down the road.

DOLPHIN TALE

* WIRED magazine ran an article on animal prosthetic limbs, most of which were not such news -- even here in Loveland, Colorado, there's a dog that prances around energetically on four prosthetic paws. However, one of the case studies was an attention-getter: a female bottlenose dolphin named Winter who lives at the Clearwater Marine Aquarium in Clearwater, Florida, star of a recent movie named DOLPHIN TALE. Winter got her tail caught in a crab trap and the tail had to be amputated. Prosthetic experts built her a new tail, which meshes with the stump using a sock made of a special gel to improve comfort. The gel is now being used in human prostheses.

COMMENT ON ARTICLE
BACK_TO_TOP

[WED 12 OCT 11] FLOATING SOLAR ARRAYS

* FLOATING SOLAR ARRAYS: As reported by an article from THE NEW YORK TIMES ("Solar On The Water" by Todd Woody, 19 April 2011), solar power system manufacturers have been energetic in expanding their range of offerings. The latest innovation is the floating solar array.

In Petaluma, California, 144 solar panels sit on pontoons moored in an irrigation pond surrounded by vineyards. Not far to the north, in the heart of Napa Valley, another array of 994 solar panels covers the surface of a pond at the Far Niente Winery. What's the attraction of floating solar? Says Larry Maguire, boss of Far Niente: "Vineyard land in this part of the Napa Valley runs somewhere between $200,000 and $300,000 an acre. We wanted to go solar, but we didn't want to pull out vines."

The company that installed the two arrays, SPG Solar of Novato, California, along with Sunengy of Australia and Solaris Synergy of Israel, is among the companies energetically trying to build a market for solar panels on agricultural and mining ponds, hydroelectric reservoirs, and canals. While it's a niche market, it's potentially a big one on a global scale, and the companies report considerable customer interest. Developing countries that are short of power and usable land but long on water find the concept particularly attractive, though customers in the developed world like the idea as well.

Sunengy, based in Sydney, said it had signed a deal with Tata Power, India's largest private utility, to build a small pilot project on a hydroelectric reservoir near Mumbai. Solaris Synergy, meanwhile, said it planned to float a solar array on a reservoir in the south of France in a trial with the French utility EDF. MDU Resources Group, a giant mining and energy infrastructure conglomerate based in Bismarck, North Dakota, has been in talks with SPG Solar about installing floating photovoltaic arrays on settling ponds at one of its California gravel mines.

According to Bill Connors, a senior official at MDU: "We don't want to put a renewable resource project in the middle of our operations that would disrupt mining. The settling ponds are land we're not utilizing right now except for discharge, and if we can put that unproductive land into productive use while reducing our electric costs and our carbon footprint, that's something we're interested in." Connors didn't mention costs, but he pointed out the obvious: "We wouldn't be looking at systems that are not competitive."

SPG Solar's main business is setting up conventional solar systems for homes and commercial operations. It built Far Niente's 400 kilowatt floating array on a pond in 2007 as a special project, then went on to develop a commercial version called "Floatovoltaics", which is now floating on the Petaluma pond. Says Phil Alwitt, a project manager for SPG Solar: "We have been able to utilize a seemingly very simple system, minimizing the amount of steel. With steel being so expensive, that's our main cost."

The Floatovoltaic array is built around standard photovoltaic panels manufactured by Suntech of China. They're mounted on a lattice framework, set at an 8-degree angle to catch more sunlight, with the framework kept above water by pontoons and anchored in place with tie lines. The array features 2,016 panels and generates a peak power of a megawatt of electricity.

The companies selling floating solar systems claim the cooling effect of the water yields a slight increase in power compared to ground-based systems, as well as reducing evaporation and slowing the growth of algae. However, these are merely fringe benefits, the real selling point being renewable power without taking up valuable real estate.
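
The cooling benefit is easy to ballpark: crystalline-silicon panels typically lose on the order of 0.4% to 0.5% of their output for every degree Celsius they run above their 25-degree rating point, so a panel kept, say, 10 degrees cooler by the water should put out a few percent more power. A rough sketch, with the panel rating and temperatures assumed purely for illustration:

    # Rough estimate of the power gain from water-cooling a solar panel.
    # The -0.45%/degC temperature coefficient is typical of crystalline silicon;
    # the panel rating and temperatures are illustrative assumptions.
    temp_coeff = -0.0045       # fractional power change per degC above 25 degC
    rated_power_w = 230        # nameplate panel rating (assumed)

    def panel_power(cell_temp_c):
        return rated_power_w * (1 + temp_coeff * (cell_temp_c - 25))

    ground = panel_power(55)   # hot ground-mounted panel (assumed temperature)
    floating = panel_power(45) # water-cooled floating panel (assumed temperature)
    print(f"gain from cooling: {100 * (floating - ground) / ground:.1f}%")
    # => on the order of a 5% gain for a panel running 10 degC cooler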

floating solar arrays

Solar entrepreneurs have been playing up the benefits of floating solar arrays to the California State Water Project, seeing the state's aqueducts and reservoirs as ideal sites. Indeed, setting up floating solar arrays on reservoirs behind hydroelectric dams would be a very attractive combination: the dam already has a power distribution system that could be expanded to handle the solar arrays, and the panels could provide power during the day, with the dam conserving water to provide power at night. Are floating solar panels the answer to the world's energy problems? Probably not, but if floating solar is a going business proposition on its own terms, people are going to run with it.

* In other solar news, the high-altitude Atacama Desert of Chile is well-known to astronomy enthusiasts; it's one of the driest places on Earth, making it an excellent site for astronomical observatories, most notably the European Southern Observatory complex. According to BUSINESS WEEK, the sunny climate also makes the Atacama Desert an ideal site for solar power, with over a dozen plants in the works. The primary customers are mining operations that exploit the Atacama region, mostly for copper; the mines and their associated refineries are energy-intensive, and they strain the ability of Chile to provide the power they need. By generating power locally, the solar plants will be able to provide electricity at rates competitive with conventional power plants.

COMMENT ON ARTICLE
BACK_TO_TOP

[TUE 11 OCT 11] HPV VACCINE BATTLE

* HPV VACCINE BATTLE: The political race for the 2012 election is heating up, with Republican contenders facing off to see which one of them will become the party's candidate. As reported by an article in THE NEW YORK TIMES ("Remark On HPV Vaccine Could Ripple For Years" by Denise Grady, 19 September 2011), during a debate among the hopefuls Michele Bachmann, darling of the fundamentalist Right, sniped at Texas Governor Rick Perry, by no means a flaming liberal himself, for stepping out of bounds in 2007 when he ordered a vaccination program for schoolgirls to protect them against the human papilloma virus (HPV), which can cause cervical cancer. The Texas legislature objected and the matter went no further, but Bachmann went beyond criticizing executive overreach in calling the HPV vaccine "dangerous", and adding later that the vaccine could cause mental retardation.

The medical and public health community fired back at Bachmann immediately, with the American Academy of Pediatrics (AAP) issuing a statement that led off with saying the organization "would like to correct false statements made in the Republican presidential campaign that HPV vaccine is dangerous and can cause mental retardation. There is absolutely no scientific validity to this statement. Since the vaccine has been introduced, more than 35 million doses have been administered, and it has an excellent safety record."

Bachmann waffled, saying she wasn't a doctor or scientist, she was just relaying stories she had heard. Perry defended the use of the vaccine, though he admitted he had handled the vaccination program poorly. Few think the episode was to Bachmann's credit, but the harm may have been done. When politicians or celebrities raise alarms about vaccines, no matter how absurd the accusations, vaccination rates drop. Dr. Rodney E. Willoughby, a professor of pediatrics at the Medical College of Wisconsin and a member of the committee on infectious diseases of the AAP, said: "These things always set you back about three years, which is exactly what we can't afford."

Historically, Willoughby said, vaccine scares have caused vaccination rates to drop for three or four years, and have led to outbreaks of diseases that had previously been under control, like measles and whooping cough. Measles cases in the USA reached a 15-year high in the spring of 2011, with more than 100 cases, most in people who had never been vaccinated. Once the disease begins to reappear, parents become worried and start vaccinating again. With cervical cancer, Willoughby said: "Unfortunately, the outbreak is silent and will take 20 years to manifest."

Although use of the HPV vaccine has been strongly recommended by medical groups for 11- and 12-year-olds, its use was already low before Bachmann spoke out against it. While there's been agitation against vaccines for as long as vaccines have been around, current fears are based on worries that vaccines can cause childhood autism -- though no replicable studies have shown they do. A recent report by the Institute of Medicine, which advises the US government, found that the HPV vaccine was safe, though it did say evidence hinted at the possibility of severe, if rare, allergic reactions.

HPV infection is extremely common, the virus being the most common sexually transmitted pathogen in the United States. More than a quarter of girls and women ages 14 to 49 have been infected, with the highest rate, 44%, in those ages 20 to 24. HPV is usually harmless -- when it does make its presence known, it's usually in the form of warts -- but in a small proportion of those infected it can cause cancer: not just cervical cancer but also, it now seems, cancers of the penis, anus, vagina, vulva, and throat.

human papilloma virus

Studies by the US Centers for Disease Control & Prevention (CDC) show that only 32% of teenage girls got all three shots needed for the HPV vaccine to be fully effective, well below the uptake of two other vaccines that were licensed around the same time, one for meningitis and a combination shot against tetanus, diphtheria, and whooping cough. Why the unusual prejudice against the HPV vaccine? It seems to be due to the fact that HPV is a sexually-transmitted pathogen, with parents worrying that their young daughters are being prepped for sexual activity.

Dr. William Schaffner, an infectious diseases expert at Vanderbilt University, agreed that 11 or 12 is "a pretty tender age, and parents are having a hard time getting used to this concept." However, as is typical of vaccines, the HPV vaccine must be given before a person is exposed to the virus or it will not work. As Schaffner put it: "Here we'd like to get it completed before the young woman initiates her sex life. Of course parents, particularly fathers, think that's going to happen at around age 34." Willoughby suggests that it's the fact that the girls are close to puberty when they get the vaccine that makes parents nervous, and thinks it might well be administered earlier with less controversy.

There are many strains of HPV, but two of them, known as Type 16 and Type 18, cause 70% of all cervical cancers; other strains can cause genital warts. One version of the vaccine, Gardasil, made by Merck, works against the two cancer-causing strains and two other strains that are the most common causes of genital warts. Gardasil was approved for use in boys in 2009 to prevent genital warts, but medical groups such as the AAP have not yet recommended it. Another version, Cervarix, made by GlaxoSmithKline, only protects against the cancer-causing strains, and is approved only for females. By June 2011, more than 35 million doses of the two cervical cancer vaccines had been distributed in the USA, with no reports of dangerous side effects. In studies comparing women who were vaccinated with those who were not, the vaccines were 93% to 100% effective at preventing infection with HPV Type 16 and Type 18.

Some critics of the vaccine have said it isn't needed in the USA since cervical cancer is no longer common here, thanks to pap tests that find precancerous growths so they can be removed. However, deaths are only a component of the trouble caused by HPV. Several hundred thousand women a year in the United States need surgery for precancerous lesions caused by the virus, and many more are treated for other cervical abnormalities linked to the infection. HPV may not kill them, but can still impair their fertility and otherwise make their lives miserable. The facts are all in favor of the HPV vaccine, but public resistance to it and other vaccines persists, much to the exasperation of doctors and public health officials. They can only keep on repeating that the risks presented by the vaccine are lost in the statistical noise -- while the risks of not using it are very real and tangible.

COMMENT ON ARTICLE
BACK_TO_TOP

[MON 10 OCT 11] FEEDING THE NINE BILLION (4)

* FEEDING THE NINE BILLION (4): The third component of the fundamentals for food production, fertilizer, was the most critical for the boom in food production during the 20th century. The most important fertilizer is nitrogen; traditionally it was provided by manure and guano, but the Haber process -- developed just before World War I to produce ammonia from atmospheric nitrogen, and pressed into wartime service for the manufacture of explosives -- permitted production of fertilizer limited only by the ability to construct the factories to do it. Without ammonia for fertilizer, big increases in crop yields would not have been possible, and the Haber process is ranked as one of the most important inventions of the 20th century.

Impoverished African farmers only use an average of about 10 kilograms of fertilizer per hectare, while Indian farmers use about 180 kilograms per hectare. India isn't vastly more prosperous than Africa, and some agricultural experts believe that African farmers could double yields by the simple measure of doubling fertilizer use. On the other hand, it's possible to overdo it. Fertilizer use in China is heavily subsidized, with the result that since 1990 its use has risen by about 40% -- but Chinese crop yields haven't improved by much in that time. The Chinese could cut fertilizer use by a third with no difficulty, and in fact it would be helpful: runoff of excess fertilizer causes runaway algal blooms in bodies of water, leading to anoxic "dead zones".

Overall, however, global agricultural production could benefit from more fertilizer -- but nobody's expecting that much more fertilizer to become available. While the Haber process can extract as much ammonia as desired from the atmosphere, it does so at considerable cost in plant infrastructure and energy, and there doesn't seem to be the money to expand capacity by any great factor. In addition, there are now worries we're running out of phosphorus, a secondary but critical nutrient, and that may be placing a bottleneck on fertilizer production that can't be dealt with just by building more infrastructure. Fertilizer costs went to higher peaks in 2007:2008 than food did.

* Along with the resources for plant production, there are also troublesome external factors that undermine it. For one, crops suffer from a wide range of pests and afflictions, from locusts to fungal smuts to viruses, that take their toll on food production. Agritech fights back with pesticides, anti-fungal chemicals, new plant varieties, and of late GM plants featuring improved defenses against pests. Such measures have proven generally able to hold the line, but measures such as pesticides can have side-effects -- and worse, all types of pests demonstrate a seemingly determined ability to evolve around defenses, a "Red Queen's race" in which those who devise new defenses have to run as fast as they can just to stay in the same place. Not only do pests keep getting trickier, but they also have a nasty tendency to hitch intercontinental rides on human transport, turning local pests into global ones.

locust plague

At present, the general belief is that pests are burdensome but manageable, that we are keeping up in the Red Queen's race. What is more worrisome is climate change. Critics of climate change scenarios dismiss them as hysteria, but farmers are not always so complacent, finding that increasingly erratic weather is making life difficult for them. Says a Bangladeshi farmer: "I know I am supposed to sow by a certain time or date. That is what my forefathers have been doing. But then for several years the temperature and weather just does not seem right for what we have been doing traditionally. I do not know how to cope with the problems."

Agriculture is clearly a contributor to climate change, with some estimates placing the farming industry's contribution to emissions at a third of the total, partly from emissions by machinery used in production and distribution, partly from destruction of wildlands for farmland. The process may end up being self-limiting. While global warming may allow crops to be grown in places previously too cool to support serious agriculture, and increased carbon-dioxide concentrations will improve plant growth, studies indicate that the gains will not come close to matching the losses in regions where crops will become harder to grow.

The International Food Policy Research Institute (IFPRI) put together a model of what food production might be like in 2050 under climate-change scenarios, and came to the alarming conclusion that on that basis food production will be lower in 2050 than it was in 2010. Wheat will suffer the worst, with the fertile plain of the Ganges River in India being the hardest hit. Research is being conducted on new crop varieties that can tolerate more extreme conditions, but those working on such modified crops say the effort is underfunded and not nearly adequate to meet the challenge.

* As a footnote to the issues of producing enough food, there is also the issue of food wastage. The general estimate is that from a third to half of all food grown is wasted, with the ratio being much the same in poor countries as it is in rich countries.

However, the mechanisms of loss in the two places are very different. The problem in poor places like Africa is, once again, lack of resources -- limited storage facilities for crops, refrigeration, and transport. Without proper silos, grain crops are whittled away by rats, mice, and locusts; without refrigerated facilities and transport, milk and vegetables spoil before they get to market. Fixing such problems demands infrastructure investment, but the value of the produce saved makes the infrastructure pay for itself fairly rapidly. Still, it means a lot of money, and building up infrastructure can't happen overnight.

The wastage in the rich world mostly takes place at the consumer end, not the producer end. Studies in the US and the UK show that a quarter of the food obtained by restaurants or direct consumers goes into the garbage disposal. The worst case is salads because they tend to go bad fairly quickly; about half of the salad fixings sold end up being trashed. A third of all breads, a quarter of fruit, and a fifth of vegetables are disposed of uneaten. In the USA, this wastage amounted to 43 million tonnes in 1997; and in the UK, to 4 million tonnes in 2006. Extrapolating to the rest of the rich world, the wastage probably runs to about 100 million tonnes a year. If this wastage could be cut in half, there would be plenty of food for a global population of 9 billion.
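
The 100-million-tonne figure is a simple population scaling of the US and UK numbers; a rough sketch of the arithmetic, with the rich-world population figure assumed purely for illustration:

    # Back-of-envelope extrapolation of consumer food waste to the whole rich world,
    # scaling the US and UK figures cited above by population. The population
    # numbers are rough, illustrative values.
    waste_mt = {"USA": 43, "UK": 4}         # million tonnes per year, as cited above
    population_m = {"USA": 300, "UK": 60}   # populations in millions, rough figures
    rich_world_population_m = 1000          # assume ~1 billion people in rich countries

    per_million_people = sum(waste_mt.values()) / sum(population_m.values())
    estimate = per_million_people * rich_world_population_m
    print(f"estimated rich-world waste: ~{estimate:.0f} million tonnes/year")
    # => roughly 130 million tonnes a year, the same order as the ~100 million cited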

Unfortunately, nobody has really good ideas for how to make that happen; the most that anyone's been able to do is implement schemes in which waste food is used as a feedstock for digester systems to produce methane for fuel. The truth is that food is still relatively cheap in the rich world, and nobody sees it as becoming so much more expensive as to seriously curb the inclination to toss it out. [TO BE CONTINUED]

START | PREV | NEXT | COMMENT ON ARTICLE
BACK_TO_TOP

[FRI 07 OCT 11] THE KILLING OF JFK -- THE BALANCE OF EVIDENCE (34)

* THE KILLING OF JFK -- THE BALANCE OF EVIDENCE (34): The other witnesses to the Tippit shooting didn't get as good a look at it as Helen Markham did. Domingo Benavides saw the shooting and could corroborate the general details of the incident; as noted, he told the Dallas police he couldn't identify the gunman, though he did tell the Warren Commission later that it was Oswald. One detail he told the commission would become more significant later:

BEGIN QUOTE:

I think there was another car that was in front of me, a red Ford, I believe. I didn't know the man, but I guess he was about 25 or 30, and he pulled over. I didn't never see him get out of his car.

END QUOTE

That turned out to be the red Ford Galaxie XL500 owned by Jack Ray Tatum. Tatum's testimony has been slighted by conspiracy theorists because he didn't come forward until the HSCA got wind of him, but though that is an issue, his reluctance to say anything also suggests he was no sensationalist or publicity hound. The "red Ford" was corroborated by other witnesses, and Tatum's own comments gave a description of the scene that matched the other reports:

BEGIN QUOTE:

TATUM: I was preparing to turn left on 10th Street from Denver. I noticed an individual walking in my direction with a light, zippered jacket on, darker pants, and a squad car pulling over to the curb next to him. As ... I approached the squad car, I noticed that that individual was leaning over, talking to the officer. He had both hands in the pockets of his jacket. I continued through the intersection and about in the middle of the intersection, I heard three, maybe four shots.

He then looked around, surveyed the situation and started a slow run toward my direction. I put my car in gear and drove forward and watched him through the rearview mirror. I saw him very clearly and realized that there was one thing that made him stand out and that was his mouth that curled up. I couldn't mistake that.

INTERVIEWER : Kind of ... a smile?

TATUM: Yes ... And I was within 10:15 feet of that individual and it was Lee Harvey Oswald.

END QUOTE

Taxi driver William Scoggins saw Oswald confronted by Tippit. Scoggins didn't see the shooting itself, but saw the policeman fall, and then saw Oswald leaving the scene of the crime with a pistol in his hand. Virginia and Barbara Davis heard the shots and Markham's screams, looked out the window, and spotted Oswald reloading his revolver; Benavides also saw the shooter emptying the revolver, tossing the empty cartridges, and reloading.

Oswald then went past two used-car lots with the pistol raised in his hand. He was identified by six witnesses at the used-car lots, including Ted Callaway, as mentioned, as well as Sam Guinyard, B.M. Patterson, Harold Russell, and Warren Reynolds. Patterson and Reynolds followed Oswald, until he gave them the slip at the Texaco gas station. A woman named Mary Brock at the gas station also identified Oswald. That gave about ten witnesses who identified Oswald as the man who shot Tippit or was fleeing the scene of the crime with a pistol in his hand. Even if the odds were 50:50 of any one witness correctly identifying Oswald, the odds of all ten people getting the ID wrong were less than one in a thousand.
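
That "one in a thousand" is just the chance of ten coin-flips all coming up wrong; a trivial check, with the 50% per-witness error rate and the independence of the witnesses both being generous illustrative assumptions:

    # Probability that all ten witnesses independently misidentified the gunman,
    # assuming a (generous) 50% error rate per witness -- illustrative only.
    p_wrong = 0.5
    witnesses = 10
    print(p_wrong ** witnesses)   # => 0.0009765625, a bit under 1 in 1000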

Conspiracy theorists have claimed that only one witness, Helen Markham, actually saw the shooting, discounting Tatum's testimony as unreliable. However, the other witnesses heard the shots and saw Oswald leaving the scene of the crime with a pistol in his hand. None of these witnesses saw any other suspicious figure associated with the shooting.

* Conspiracy theorists have played up witnesses with contrarian accounts of the Tippit shooting. One was Acquilla Clemons, who said she saw two men at the scene. Clemons only really came to light thanks to Mark Lane, who claimed in 1966 that the authorities knew she was a witness but had refused to call her to the stand, and in fact according to Clemons they warned her to keep her mouth shut.

The problem is that there is no record of Clemons having come forward after the killing of Tippit, and since her house was well down the street from the shooting, there was no reason for the authorities to ask her about it. Lane was only tipped off to Clemons by a 1964 article by George and Patricia Nash, two conspiracy theorists who had interviewed her -- how they came across her is unclear. However, the Nashes' article said nothing about two men, and in fact the Nashes found Clemons "rather vague", suspecting that she was just feeding them "second-hand accounts by others at the scene." That may have been too harsh: Clemons may have come out after the ruckus and seen people, such as Ted Callaway, who had come to Tippit's police car after the shooting.

Another contrarian witness was Frank Wright, who also lived down the street from the shooting. He claimed that the shooter actually drove away from the scene of the crime. Nobody else saw any such thing; Wright may have actually seen Callaway, carrying Tippit's pistol, driving away with Scoggins. Wright also claimed that there "wasn't anyone else out there at all" at the scene of the shooting of Tippit, though after the shooter departed, people quickly began to accumulate there. When the ambulance arrived, the driver, Clayton Butler, said there were at least ten people standing around. [TO BE CONTINUED]

START | PREV | NEXT | COMMENT ON ARTICLE
BACK_TO_TOP

[THU 06 OCT 11] SCIENCE NOTES

* SCIENCE NOTES: It is well known that the confetti-sized fragments of plastic goods lost at sea have a tendency to accumulate in patches of ocean. Surprisingly, a survey of the western North Atlantic over the past two decades shows the amount of plastic in these regions doesn't seem to be on the increase, even though plastic production has more than tripled in that timeframe. Where is the plastic going?

On a recent cruise in the North Atlantic, researchers of the Sea Education Association (SEA) at Woods Hole, Massachusetts, examined plastic particles retrieved from the ocean. Superficially the particles seemed smooth, but under an electron microscope they turned out to be "covered with microbes", as one of the SEA researchers put it. Individual microbes seemed to be eroding pits into the plastic, "like hot coals burning through snow". The particular microorganisms haven't been characterized yet, but the evidence found so far that they are digesting the plastic is intriguing. It may be that new variants of oceanic microorganisms are emerging with a stronger appetite for plastics -- which raises the possibility that the problem will end up taking care of itself.

* The pollination services provided by honeybees are extremely important to modern agriculture, and so there has been considerable concern over the decline of honeybee hives in recent years. Nobody's 100% certain of the cause of "colony collapse disorder", but it is suspected to be due to a combination of pressures, including parasites, most notably mites; new strains of viruses that afflict bees; and fungal infections.

in search of better bees

Some researchers are taking a straightforward approach to solving the problem by breeding tougher bees. Researchers at the University of Manitoba and the University of Guelph have been distributing queen bees from mite-resistant hives across Canada, then selecting queens from the most robust of the resulting hives; they've gone through seven generations so far. Breeding programs are also in progress at the University of Minnesota and at the US Department of Agriculture's (USDA) research labs in Baton Rouge, Louisiana.

Says one USDA researcher: "We are looking for bees that are resistant to mites and with a greater tolerance to viruses because they appear to be the two main factors behind colony loss." Some species of bees, particularly Russian bees, have behaviors to deal with mites, with the bees able to clean themselves or even perform "sweeps" of a hive to clean out the mites. The USDA is currently breeding Russian bees at Baton Rouge.

Another desirable attribute is the ability to withstand harsh North American winters. While traditionally less than half of the population of European bees survives the winter, the latest generations have a 75% survival rate. Whether bees can be improved fast enough to keep pace with the threats remains to be seen.

* As reported by BBC WORLD Online, an international consortium of genomics researchers is now beginning the awkwardly-named "5000 Insect & Other Arthropod Genome Initiative" which, just as the name says, will unravel the genomes of "bugs" from all over the world. While the project has exciting pure-research aspects -- for example, in acquiring a better understanding of the evolutionary history of insects -- the people involved are of course stressing the practical benefits, such as improving pest-control measures through a better understanding of the pests.

* New and interesting fossils keep popping up all the time. A case in point is a 70 million year old dinosaur egg fossil found in the Patagonia region of Argentina. The eggs were from a "titanosaur", a sauropod dinosaur related to the classic brontosaurus -- yes, that's an obsolete name, but it's the one most widely recognized. The egg fossils were found in 1989, but it wasn't until recently that one of the broken eggs was examined and found to contain tiny sausage-shaped structures, about 2 or 3 centimeters long and a centimeter wide (about an inch by half an inch). The fossil structures closely resembled fossilized insect cocoons, and were most similar in size and shape to the cocoons of some species of modern wasp.

fossil egg with wasp cocoons

Wasps are often parasitic, preying on a wide range of other arthropods; this is the first evidence that they also evolved into scavenging niches, in competition with flies, beetles, and other scavengers that would descend on broken eggs. Given the size of the dinosaur eggs, about 20 centimeters (8 inches), they would have been an attractive source of nutrition for wasp grubs -- though it is also possible that the wasps were engaging in their more familiar parasitic habits, parasitizing beetles or other scavengers eating the broken egg.

COMMENT ON ARTICLE
BACK_TO_TOP

[WED 05 OCT 11] JUTE MAKES A COMEBACK

* JUTE MAKES A COMEBACK: Jute is a plant of South Asia that has long been used as a source of fiber for sackcloth, mats, and cord. As reported by an article from BBC WORLD Online ("Bangladesh's Golden Fibre Comes Back From The Brink" by Anbarasan Ethirajan), at one time it was a major export crop of Bangladesh, the country's "golden fiber", but it was largely displaced by synthetic polymers from the 1980s. In the 21st century, however, biodegradable jute is seen as more environmentally friendly than synthetics, and the jute business is booming, with Bangladesh exporting more than a billion dollars' worth of the stuff this year.

Jute is a tall, slender, leafy plant; when it reaches full growth it is cut down, bundled up, and soaked in water for a few weeks to allow the constituents of the plant holding together its fibrous material to rot. After soaking, the plant is stripped into fiber and dried, with the product then taken to market. Jute is the second most important natural fiber after cotton in terms of cultivation and usage. It is mainly grown in eastern India, Bangladesh, China and Burma. India is actually the biggest grower of jute but consumes the bulk of production internally due to national laws specifying its use, leaving Bangladesh the biggest exporter. Bangladeshi jute is regarded as being of the highest quality, though India has more advanced agritech for jute production.

jute production

Along with biodegradability, jute is cheap, strong, and durable. The international enthusiasm for jute has been enhanced by new uses invented for the material. Says Mohammad Asaduzzaman, a scientist at the Bangladesh Jute Research Institute in Dhaka: "By processing the fiber mechanically and by treating it chemically, now jute can be used to make bags, carpets, textiles and even as insulation material."

Industrial-grade jute fabric can be used as a "geotextile" for soil-erosion control and reinforcing roads. Jute also makes a good feedstock for pulp and paper, and Bangladeshi researchers are working on a scheme to blend jute with cotton to produce denim, reducing the need for relatively expensive cotton and hopefully cutting the price of denim in half. The Bangladeshi government has encouraged the industry by mandating use of jute bags for packaging food grains.

When jute began its decline, Bangladeshi jute mills were shut down and farmers switched to rice. Now the farmers are coming back, with about 5 million Bangladeshis growing the crop, making it a major component of the country's agricultural industry. Bangladeshi researchers say that much more could be made of jute if more money was invested into crop and manufacturing studies.

jute fabric

* I found this article interesting because I had earlier run across a comment on jute in Field Marshal Viscount William Slim's DEFEAT INTO VICTORY, his memoirs of the war against the Japanese in India-Burma. The theater was regarded as low priority and so Allied forces there had to make do with what they could scrape up. Allied strategy ultimately relied heavily on air power, one major aspect being supply of forces operating in the Burmese jungle through parachute airdrops. The problem was that shipments of parachutes to the theater were hopelessly inadequate to needs -- but since the front lines were in what is now Bangladesh, there was plenty of jute, and Slim's people came up with a jute parachute after about a month's tinkering.

The "parajutes" were made with both jute cord and cloth, making maximum use of the material. While parachutes traditionally have a hole in the top to keep air from spilling out the sides, resulting in an unstable drop, the loose weave of the parajute meant the hole wasn't needed, simplifying construction. Although the parajute fell too fast to handle a paratrooper, it worked perfectly well for cargo drops, and it only cost a twentieth as much as a standard-issue parachute -- while permitting a supply of as many parajutes as needed. Slim commented that the response of headquarters in Delhi was to reprimand him for not working through normal channels; in modern terms, he replied that they should get a life.

BTW, for World War II enthusiasts DEFEAT INTO VICTORY is quite a book, detailing how the Allies were run out of Burma in humiliation in 1942 but, after a nasty learning curve, returned to utterly crush the Japanese in 1945 even though the opposing forces were about equal in number. Thanks to the obscurity of the theater, Bill Slim remains one of the least-known senior British generals, but many military scholars regard him as the finest general produced by Britain since the Duke of Wellington -- and on reading him, he comes across as one hell of a stand-up guy.

COMMENT ON ARTICLE
BACK_TO_TOP

[TUE 04 OCT 11] SATELLITE SERVICING GETS REAL?

* SATELLITE SERVICING GETS REAL? The concept of on-orbit servicing of spacecraft has been around since the early days of spaceflight, but except for some high-profile cases, like the shuttle repair missions to the Hubble Space Telescope, it's turned out to be problematic. Shuttle repair missions were far too expensive to make economic sense, it being usually cheaper just to launch a replacement spacecraft instead, and the shuttle had no reach above low Earth orbit.

That left the option of using robot spacecraft for the repair job, but development of such technology has moved slowly. In 1997:1998, the Japanese "Engineering Test Satellite 7 (ETS-7)" performed a rendezvous with a cooperative target satellite and changed components on it using a remote manipulator arm. The US Air Force Research Laboratory flew the "XSS-10" smallsat in 2003 and the follow-on "XSS-11" smallsat in 2005, both testing robot rendezvous technology by maneuvering around the upper stage of their launch vehicles. The US National Aeronautics & Space Administration flew a mission designated the "Demonstration of Autonomous Rendezvous Technology (DART)" in 2005 as well to prove rendezvous technologies, but that flight was a failure. More encouragingly, in 2007 the US military's Defense Advanced Research Projects Agency flew a two-spacecraft mission named "Orbital Express", discussed here some years back, that successfully demonstrated automated rendezvous and docking technologies oriented towards space servicing.

Orbital Express

Uncertainty over cost-effectiveness of orbital servicing has played a role in the sluggish development of the concept, but now, as reported by an article in AVIATION WEEK ("Changing The Game" by Frank Morring JR, 21 March 2011), MacDonald, Dettwiler, & Associates (MDA) -- a Canadian space firm that has developed sophisticated robot arms for the International Space Station -- is looking towards development of an operational robot space servicing system, to be funded by a potential $280 million USD deal with Intelsat, the world's biggest satellite communications operator.

As envisioned, the robot servicer will be able to refuel or repair 75% of the commercial geostationary comsats now in operation. The servicer will use a lidar ranger to guide its approach to the target satellite from below, performing the final approach at a gentle speed of a few centimeters per minute. The servicer will then inspect the target satellite with a video analysis system to map out its features, and insert a grapple arm into the nozzle of the target's rocket kick motor. Once grappled, the servicer will mate up with the target and use its robot arm, supported by a flexible toolkit, to refuel or otherwise service the target. The target spacecraft will be temporarily taken to a higher orbit during servicing so an accident won't scatter debris along the geostationary orbit belt. Initial tests will be performed on an Intelsat comsat near the end of its operational life.
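
Purely as an illustration of the sequence just described -- this is not MDA's actual flight software, and all the names are hypothetical -- the servicing profile reads like a simple ordered state machine:

    # Illustrative sketch of the servicing profile described above, written as a
    # simple ordered list of phases. Not MDA's software; names are hypothetical.
    SERVICING_PHASES = [
        "lidar_guided_approach",   # close from below at a few centimeters per minute
        "video_inspection",        # map the target's features with the video system
        "nozzle_grapple",          # insert the grapple arm into the kick-motor nozzle
        "hard_mate",               # mate the servicer to the target
        "raise_orbit",             # move the pair above the geostationary belt
        "refuel_or_repair",        # work on the target with the robot arm and toolkit
        "release",                 # return the target to its slot and depart
    ]

    def run_servicing(execute_phase):
        """Run each phase in order; execute_phase is a callback supplied by the caller."""
        for phase in SERVICING_PHASES:
            execute_phase(phase)

    run_servicing(print)           # demo: just print each phase in sequence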

The servicer will be built at MDA headquarters in Canada and launched by a Russian Proton booster or an equivalent, with the spacecraft carrying a fuel load of about two tonnes (2.2 tons). Once that fuel is exhausted, the servicer will be refueled by new tanker spacecraft; the tankers will be relatively low-cost, with the servicer having the smarts to dock with them and draw their fuel. MDA will conduct servicing operations from Intelsat headquarters in Washington DC, installing workstations there and using Intelsat communications to support the mission. Intelsat has "done the numbers" on the repair and refueling scheme and sees it as cost-effective. Considering that this will be a pioneering endeavor, costs will necessarily include new technology development -- which means that if this effort pays off, other customers farther down the road will find the scheme even more attractive.

Although MDA is on the leading edge of commercial space servicing, there's competition -- in the form of ViviSat, a collaboration of US Space and ATK. ViviSat is taking a different approach to the issue, developing a simpler servicer that would generally only be used once, docking with a satellite and acting as an external fuel tank. However, it would also be able to dock with a satellite trapped in a low orbit by a launch failure -- a relatively common occurrence -- and boost it to its proper orbit with the servicer's own engine, to then undock and move on to its final target spacecraft.

Comsat manufacturers, who like to sell replacement satellites, are skeptical about the potential of robotic space servicing, arguing that the technology is too immature and the financial payback unclear. David Thompson, boss of Orbital Sciences Corporation, admits that space servicing is a coming thing, but adds that he doesn't think "it's something you're going to see as a purely private undertaking."

COMMENT ON ARTICLE
BACK_TO_TOP

[MON 03 OCT 11] ANOTHER MONTH

* ANOTHER MONTH: Having tried to operate a message board as part of the website only to see it eventually fizzle out, I decided to junk it, sorting through the postings before I did to see what was worth saving. One interesting posting concerned Samuel Arbesman, a research fellow at Harvard Medical School, who re-imagined the Milky Way in the style of the famous London Underground map. Although the "Milky Way Transit Authority (MWTA)" was created for fun, it does provide an accurate insight into the scale and locations of various nebulas and clusters, and the solar system's location (Sol) in our galaxy.

Milky Way Transit Authority map

Arbesman said: "I had re-read Carl Sagan's novel [CONTACT] about a year ago, and in the story he alludes to some sort of cosmic Grand Central Station. That, coupled with my longtime interest in transit maps, got me thinking about how to understand the vastness of our own galaxy by using the concept of transit maps. Since transit maps are essentially beautiful abstractions for distilling a city down to a set of linkages and interconnections, perhaps a similar sort of thing could be done for the Milky Way."

* I'm not sure the message board would have been a good thing had it worked out better, since I've come to see online comments as a bad idea. I was reading the science site "physorg.com" for a time and finally got sick of the squabbling in the comments pages. Then I discovered "sciencedaily.com", which effectively covers the same materials but doesn't support comments. It was such an improvement: who needs comments? Same thing with "CNN.com", which has comments, and "time.com", which doesn't: same news, no barking.

I've gone through repeated cycles of getting involved in online feuds and then dropping out in disgust. The perverse thing is that the quarreling has an addictive quality that keeps drawing me back in, even though I know better. I finally realized it's a kind of porn, people stimulated by picking fights and people getting a comparable stimulation out of fighting back. Part of the impulse to chime in is the urge to be clever, but it's not clever; smart is walking away.

To help block the urge, I spent an hour or so tracking down a website-blocking add-on for Firefox, and found a very effective one named "Leechblock", because it blocks out sites that bleed off time. It seems a little silly to go to such lengths instead of just not reading the sites, but without Leechblock it's just too easy to look them up on impulse and fall back into the rut. I can unblock sites if I want to, but I have to think through doing that, and that gives me time to think twice. I've become very fond of blocking sites.

I can hope I've finally gotten out for good, but it's hard to say -- once an addict, always an addict. I do have a precedent. I grew up, like any American kid, glued to the TV set -- but in the late 1970s I watched an episode of POLICE WOMAN, one of the crop of 1970s cop shows starring Angie Dickinson as an attractive but very implausible lady cop. It was the last straw; the episode was so unbelievably bad that I effectively didn't watch TV at all for a decade, and when I started doing so again it was very selectively.

* Anyway, I've replaced the message board with a Bravenet guestbook as a minimal tool to let readers with an itch to leave a comment to do so. Ironically, I had a Bravenet guestbook about ten years ago as well, so things have gone full circle. I'm not expecting it to get much use. I set it up through a redirection page so I can change it easily if I decide I need something better.

COMMENT ON ARTICLE
BACK_TO_TOP
< PREV | NEXT > | INDEX | GOOGLE | UPDATES | EMAIL | $Donate? | HOME