* Entries include: JFK assassination (series), smart structures (series), future of world food production (series), laser enrichment of nuclear fuel, USPS going broke, Paris Autolib car networking scheme, 3-cylinder piston engines, flexible AC transmission for smart grids, cancer screening questioned, dust and climate change, solar ultraviolet and climate change, lunar X-prize, using bacteria to refine ores, and obtaining metals from geothermal plants.
* NEWS COMMENTARY FOR NOVEMBER: As reported by AVIATION WEEK, this past September a significant but almost-unnoticed event took place: the US and Vietnam signed a memorandum of understanding on military cooperation, an agreement intended to lead to a liaison arrangement and naval interaction. It was preceded by a visit of the US Navy replenishment ship RICHARD E. BYRD to Cam Ranh Bay, the first port call of a US Navy vessel to Vietnam in 38 years. Last year, a delegation of Vietnamese officials was flown out to the carrier USS GEORGE WASHINGTON for a red-carpet tour.
Americans still haven't quite forgotten their humiliation in Vietnam, but the Vietnamese have a longer perspective on the matter. The Americans were only around for about a decade; the Vietnamese fought Chinese domination for centuries. It is China, in fact, that is driving US-Vietnam rapprochement, specifically Chinese ambitions in the South China Sea, backed by a naval force buildup on Hainan Island there. While Vietnam has traditionally obtained Russian weapons -- Vietnam has six KILO-class diesel-electric submarines currently on order from Russia, which will provide a significant boost to Vietnamese naval power -- the Vietnamese are also now obtaining Western gear. Deliveries are underway of three Airbus Military C212-400 maritime patrol aircraft, with six float-equipped Canadian-built Viking Air Twin Otter 400 machines on order. Three of the Twin Otters will be configured for patrol, the other three for utility / transport / VIP use, with convertible interiors.
Vietnam is now also strengthening ties with the Philippines, where there is similar concern over Chinese ambitions in the South China Sea. The Philippine military, long saddled with antique weapons, is now modernizing to the extent it can be afforded, in particular having recently obtained a new HAMILTON-class patrol cutter from the USA. The Philippine Air Force doesn't even have any fighter jets at present, but the country is investigating the purchase of half a dozen high-performance jet trainers, such as the Korean Aerospace T-50, that could also be used in combat roles.
Neither Vietnam nor the Philippines could match Chinese power if it came to a shootout, but they do seem determined to make it known that the Chinese won't push them around with impunity. If the Chinese then attempted to apply force majeure to get their own way, they might run the risk of being trumped by US Navy intervention -- a risk Hanoi wants to emphasize by making nice with Washington DC.
* Articles have been run in these pages in the past on corruption and how hard it is to root out. As reported by BUSINESS WEEK, six years ago India, long plagued by corruption, began an experiment in fighting corruption in the form of the "Right To Information (RTI) Act" -- somewhat similar to America's "Freedom Of Information Act", allowing citizens to request and obtain government documents. It took a while for the citizens to appreciate RTI, but now hundreds of thousands of RTI requests are made each year. They have led to arrests of officials on corruption charges, but their primary effectiveness is at a lower level: slum dwellers, for example, have learned they may not have to pay a bribe to get a ration card if they can perform an RTI request.
Good news? Yes, but with an ugly downside. On 16 August Shehla Masood, a Bhopal businesswoman, was gunned down in her car. She had been making RTI requests to reveal local government corruption after losing government contracts even though she was low bidder. At least a dozen murders have been linked to RTI requests, along with many more assaults on citizens who have made such requests. The violence over RTI, somewhat surprisingly, doesn't take place in the big cities like Mumbai, Delhi, or Bangalore; it is instead skewed towards the smaller towns, where local officials, police, and gangsters may have cozy relationships, establishing petty tyrannies.
Masood had been threatened before she was murdered; she had claimed she was being targeted by a specific Bhopal official. A reporter for BUSINESS WEEK tried to talk to the official but got the runaround, with his driver roughed up by local cops. The Asian Centre for Human Rights is lobbying to include RTI-related attacks in a "priority criminal investigations" category. The Information Ministry, which supervises RTI, says that when citizens are attacked after making RTI requests, the information will be made public, in hopes of discouraging corrupt officials from thinking they can "silence" protests. In the meantime, the Indian Central Bureau of Investigation (CBI) has taken over the Masood case from Bhopal authorities. One item that the CBI has turned up so far is that Bhopal police who found the body used Masood's cellphone to order food. The full story of the case promises to be interesting.
* On 12 November, a series of explosions rocked an Iranian military base north of Tehran where Shahab long-range missiles are stored, killing 17 personnel, including a general of high status in the country's missile program. Although the military said it was an accident, the Iranian press played it up as an act of sabotage, presumably by the Israeli Mossad intelligence service -- and Israeli media coyly didn't deny it, instead printing a list of various troubles suffered by Iran's weapons of mass destruction (WMD) program over the past few years, including assassinations of prominent Iranian weapons-development researchers.
What's really going on? Who knows -- nobody's talking. As TIME magazine pointed out, Iran's nuclear program has now effectively ceased to be covert, giving Iran's enemies -- Israel and the Great Satan America -- a green light for action. Overt measures being troublesome politically, covert measures might seem the better option, and it would certainly be perfect justice: "You pretend you don't have a WMD program, we'll pretend we're not sabotaging it."
Then again, in the absence of hard facts, can we honestly assume that an intensive "black ops" campaign is being conducted against Iran's WMD program? Maybe the devious plan is to get the Iranians to believe there is a devious plan when there really isn't, and watch them wear themselves to a frazzle trying to chase it down.
* LASER ENRICHMENT: Enrichment of uranium to produce nuclear fuels has long been a difficult and costly process -- which to an extent is a good thing, because otherwise anybody who wanted to obtain the Bomb could do so easily. However, as reported by an article from THE NEW YORK TIMES ("Laser Advances In Nuclear Fuel Stir Terror Fear" by William J. Broad, 20 August 2011), the status quo of difficulty may be changing due to "laser enrichment" technology.
Notions of using lasers to enrich uranium go back to the 1960s, but nobody could turn the idea into a practical process. Now, giant General Electric (GE) has successfully tested laser enrichment and is seeking Federal permission to build a $1 billion USD plant that would turn out reactor fuel in quantity. That might be a big help to the nuclear power industry -- but it also might be a big help to rogue states and terrorists, allowing them to make bomb fuel in smaller and less expensive plants that are easier to hide. Advocates of laser enrichment say the fears are overstated and that laser enrichment could actually make nuclear power a major factor in America's energy equation. Critics want a detailed risk assessment and have been lobbying Washington DC for one.
GE, an atomic pioneer and one of the world's largest companies, says it got laser enrichment to work in 2009 at the Global Laser Enrichment plant just north of Wilmington, North Carolina, the facility being jointly owned with Hitachi. Details are classified, but GE officials say that results have been so encouraging that they are accelerating plans for a larger complex at the Wilmington site. Donald M. Kerr, a former director of the US Los Alamos weapons lab who was recently briefed on GE's technology, said in an interview that laser enrichment has gone from "an oversold, overpromised set of technologies" to what "appears to be close to a real industrial process."
For now, the big uncertainty centers on whether Federal regulators will grant a commercial license for the planned complex. The Nuclear Regulatory Commission (NRC) is considering the issue and has promised GE a decision by next year. The Obama Administration has said nothing much about the matter; the president is known to be enthusiastic about nuclear power, but is also energetic in nuclear arms limitation efforts, and so the administration's position is unclear.
Most uranium found in nature is in the form of the U238 isotope, which is useless for nuclear power applications. In any sample of unenriched uranium metal, about 0.7% is the U235 isotope, which supports fission reactions and so can be used as a fuel. Enrichment sorts out the U235 from the U238. If U235 concentrations are enriched to about 4%, the material can fuel nuclear reactors; to 90%, atom bombs. The difficulty of enriching uranium means that a kilogram of enriched metal costs well over $2,000 USD, not as much as gold but more than silver.
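As a back-of-the-envelope illustration of why enrichment is so costly, the standard "separative work unit" (SWU) bookkeeping -- a textbook formula, not something from the article -- shows how much natural-uranium feed and separative effort the 4% and 90% figures above imply. A natural feed assay of 0.711% and a typical 0.3% tails assay are assumed here:

```python
from math import log

def value(x):
    # separative potential of material at U235 assay x (standard formula)
    return (1 - 2 * x) * log((1 - x) / x)

def enrich(xp, xf=0.00711, xw=0.003):
    """Feed (kg) and separative work (SWU) needed per kg of product
    enriched to assay xp, given feed assay xf and tails assay xw."""
    feed = (xp - xw) / (xf - xw)          # mass balance per kg of product
    waste = feed - 1.0
    swu = value(xp) + waste * value(xw) - feed * value(xf)
    return feed, swu

for xp in (0.04, 0.90):
    feed, swu = enrich(xp)
    print(f"{xp:.0%} product: {feed:6.1f} kg feed, {swu:6.1f} SWU per kg")
```

Reactor fuel at 4% takes about 9 kg of natural uranium and 5 SWU per kilogram; bomb-grade material at 90% takes over 200 kg of feed and nearly 200 SWU per kilogram, which is why enrichment capacity, not raw uranium, is the bottleneck.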
Using lasers for enrichment was proposed in 1963. The idea was to excite U235 in uranium metal with pure laser light; in principle, the resulting agitation would ease identification of the scarce isotope and aid its extraction. The expectation was that using lasers in enrichment would cut costs by an order of magnitude, and so dozens of research projects were conducted on laser enrichment around the world. However, none of them came close to coming up with anything practical, and so by the 1990s interest in the idea had generally faded out.
Not everyone gave up. Two researchers at an Australian government facility near Sydney named Horst Struve and Michael Goldsworthy kept tinkering with the idea, and by 1994 they had come up with something they believed had potential. They named their process "Separation of Isotopes by Laser Excitation (SILEX)". GE officials looked into SILEX, were impressed, and bought up rights to the process in 2006.
In late 2009, as GE experimented with its trial laser system, arms control advocates wrote the US Congress and the NRC to warn that laser enrichment might promote nuclear weapons proliferation, since it would make it easier to set up clandestine enrichment facilities. There were calls for a Federal review of the potential risks. The NRC wasn't responsive, and so late in 2010, the American Physical Society (APS) -- America's largest group of physicists, with headquarters in Washington -- submitted a formal petition to the NRC for a rule change that would compel such risk assessments as a condition of licensing. More voices have joined in to ask for a review, but the Nuclear Energy Institute, an industry group in Washington, has objected, saying new precautions were unnecessary because of voluntary plans for "additional measures" to safeguard secrets. An NRC spokesman said the petition would be considered in 2012.
GE officials, seeing the clouds gathering, did the sensible thing and performed their own assessment. They hired Kerr, who not only had run Los Alamos but also has a background in US intelligence, to lead the evaluation. He and two other former government officials concluded that the secrets of the SILEX process were unlikely to leak out and that a clandestine laser plant stood a high chance of being detected. Kerr commented: "[The Wilmington plant is] a major industrial facility. Our observation was this was not something that would sit in a garage or be easily hidden." Indeed, Global Laser Enrichment's planned commercial laser enrichment plant will be half the size of the Pentagon building.
Critics reply that the GE plant is intended for large-scale production to provide fuel for nuclear power plants, but obtaining fuel for bombs could be accomplished with a much smaller plant. The Iranians have been tinkering with laser enrichment, though they haven't successfully kept their activities secret. That in itself gives no reason to think they've gone anywhere with their research; after all, laser enrichment was intensively studied for decades with no success, and the Iranians are relative novices at nuclear technology. However, now it seems someone did get it to work, and so the Iranians have reason to think they could get it to work, too. Can we be certain they're wrong?
* SMART STRUCTURES REVISITED (2): The experiments in structure monitoring conducted by the Cambridge researchers in the UK used a small number of sensors, nowhere near as many as would be required for an operational application. As the number of sensors increases, data transmission rates become more of a problem. Given that the sensors necessarily transmit at low power, improving transmission rates by using more power is out; the sensors have to become smarter, for example by packaging up data compactly and only sending it out as required by events.
Jennifer Rice, at the time an engineer at the University of Illinois and now at Texas Tech University, led a team to develop a smart sensor network to monitor the Jindo Bridge in South Korea. The network had 113 nodes, each with six sensors, and used clever programming to control the resulting flood of data while also minimizing power consumption. The sensors were installed in the summer of 2009 to monitor tension in the bridge's suspension cables, as well as wind loading and deck vibrations.
Each of the nodes was programmed to stay in a low-power "sleep" state most of the time, but to wake up for a fraction of a second every ten seconds to check for signals from a gateway computer. A portion of the nodes took turns staying awake for longer periods, acting as "sentries", to look out for winds or vibrations above a certain threshold. The sentries then signaled the gateway to bring up the other sensors to a fully alert state. The smart operation of the network meant that sensors wouldn't run out of battery power for years; some of the sensors even had solar panels or wind generators to keep them charged up. The central computer controlling the network could assess input and send status reports to engineering staff.
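A toy sketch of the sentry scheme -- with hypothetical names and an arbitrary threshold, not the Illinois team's actual firmware -- might look like this:

```python
CHECK_PERIOD_S = 10        # nodes wake briefly every ten seconds (per the article)
WIND_THRESHOLD = 0.5       # illustrative trigger level, arbitrary units

class Gateway:
    """Central computer that can bring the whole network to full alert."""
    def __init__(self):
        self.nodes = []

    def raise_alert(self, sentry):
        # a sentry saw something over threshold; wake every node
        for node in self.nodes:
            node.awake = True

class SensorNode:
    def __init__(self, gateway, is_sentry=False):
        self.gateway = gateway
        self.is_sentry = is_sentry
        self.awake = is_sentry          # sentries stay up; the rest sleep
        gateway.nodes.append(self)

    def on_wakeup(self, reading):
        """Called every CHECK_PERIOD_S; sentries also screen their readings."""
        if self.is_sentry and reading > WIND_THRESHOLD:
            self.gateway.raise_alert(self)

gw = Gateway()
nodes = [SensorNode(gw, is_sentry=(i % 10 == 0)) for i in range(113)]
nodes[0].on_wakeup(0.9)                 # a sentry sees a gust over threshold
print(sum(n.awake for n in nodes), "of", len(nodes), "nodes fully awake")
```

The point of the design is that the radio and processor -- the big power consumers -- run for only a tiny fraction of each ten-second period on most nodes, which is what stretches battery life to years.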
* Some engineers want to go further, providing structures with the ability to react to problems instead of just reporting them. The technology is already here to a limited extent. For example, the Saint Anthony Falls Bridge that replaced the failed I-35W bridge in Minnesota has temperature sensors that can activate antifreeze spray systems to prevent the bridge from icing up.
More subtly, buildings, most notably skyscrapers, can be threatened by resonant vibrations, and structures put up in windy or earthquake-prone regions have been built with systems to help them absorb vibrations. Traditionally, such systems have been passive. To protect against earthquakes, engineers can install shock absorbers between a building's support beams to dampen the oscillations and reduce damage. To deal with strong winds, heavy weights called "mass dampers" can be hung to change the building's resonant characteristics and minimize its motion. The Taipei 101 skyscraper, built in 2004 and standing 509 meters (1,670 feet) tall, uses a passive damper system.
Passive systems are inflexible, however, unable to respond to unusual circumstances, and so in recent years active vibration damping systems have been developed. Such schemes use sensors to monitor vibrations, sending signals to a control system that then uses actuators to move weights around on the upper floors to compensate. The World Financial Center in Shanghai, built in 2008 to a height of 492 meters (1,614 feet), uses an active damper. It should be noted that the Burj Khalifa in Dubai, completed in 2009 and currently the world's tallest building at 828 meters (2,717 feet), has no damper at all: earthquakes don't happen in the region, and its three-cornered design renders it immune to the resonance effects of wind.
Active systems are much more capable, but they are also vulnerable to power outages, which often occur during earthquakes or storms. The electronics subsystems can be run off battery backup in power failures, but the actuators required to move the heavy weights cannot. A team under Bill Spencer, a civil engineer at the University of Illinois at Urbana-Champaign, is investigating "semi-active" systems, based on shock absorbers as with passive systems, but the smarts of the semi-active system can relax or stiffen them as required. That requires much less power than shifting weights around, meaning the system can be fully operated off battery backup. So far, semi-active systems have only been installed in medium-sized buildings, such as the 238 meter (780 foot) Mori Tower in Tokyo, completed in 2003.
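One widely used semi-active control law -- not named in the article, so treat it as an illustrative stand-in -- is "on-off skyhook" damping: stiffen the damper whenever its force would oppose the structure's absolute motion, relax it otherwise. A crude one-degree-of-freedom simulation, with made-up masses and coefficients, shows the idea:

```python
import math

def simulate(control, m=1.0, k=100.0, amp=0.01, dt=1e-3, t_end=30.0):
    """Base-excited one-DOF oscillator driven at resonance; `control`
    maps (absolute velocity, relative velocity) to a damping coefficient."""
    w = math.sqrt(k / m)                # drive right at the resonant frequency
    x = v = 0.0
    peak = 0.0
    t = 0.0
    while t < t_end:
        xb = amp * math.sin(w * t)      # ground (base) motion
        vb = amp * w * math.cos(w * t)
        c = control(v, v - vb)
        a = (-c * (v - vb) - k * (x - xb)) / m
        v += a * dt                     # semi-implicit Euler step
        x += v * dt
        t += dt
        if t > t_end - 10.0:            # record the steady-state peak only
            peak = max(peak, abs(x))
    return peak

C_MIN, C_MAX = 0.5, 10.0
passive = simulate(lambda v, vr: C_MIN)   # soft passive damper, for comparison
skyhook = simulate(lambda v, vr: C_MAX if v * vr > 0 else C_MIN)
print(f"passive peak {passive:.3f} m, skyhook peak {skyhook:.3f} m")
```

Switching the damper coefficient costs almost no power -- a valve setting, in effect -- which is exactly why a semi-active system can keep working on battery backup when the grid goes down.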
Engineers working on smart structures believe they can do much more. They could improve the way buildings control their heating and cooling systems, manage renewable-energy systems such as wind turbines and solar panels, and handle emergencies such as broken water pipes or electrical outages. Far down the road, a structure could dispatch robots to repair problems as they occur. That's not going to happen any time soon, but everyone working in the smart structures field believes the technology has enormous potential even in the short run. Robots can come later. [END OF SERIES]
* THE KILLING OF JFK -- JACK RUBY (1): Jack Ruby was born as "Jacob Leon Rubenstein" in Chicago in 1911, the fifth of eight children of Joseph and Fannie Rubenstein, both Orthodox Jews. The parents separated when Jacob was ten. Jack was an unruly youngster, with a hot temper, inclined to get into fights, and very good at fighting. He always had a scheme to bring in a fast buck or two, but he never stole things and never really had any serious problems with the law. Nobody who knew him ever thought of him as very bright.
Rubenstein went to California in 1933 to see if he could find a better life there, but he couldn't make ends meet, so he moved back to Chicago in 1937, working there on various hustles. He worked as a union organizer in Chicago for a time, with conspiracy theorists seeing that as linking him to organized crime, but investigations into Ruby's union activities turned up no dirt.
When war came in 1941, Rubenstein got a deferment to help take care of his senile mother. He was reclassified and drafted in 1943, being inducted into the Army Air Forces and serving out the war stateside as an aircraft mechanic. He had no disciplinary problems in the service, engaging in minor hustles like running card games or selling chocolates to other soldiers to bring in some money. The other men had no problems with Rubenstein, though he did have a nasty temper: a sergeant called him a "Jew bastard" and Rubenstein beat him bloody.
Rubenstein was discharged in 1946 and went back to Chicago, then moved down to Dallas, Texas in 1947. In that year, he legally changed his name to "Jack Ruby". After many comings and goings and unsuccessful deals, he helped run two nightclubs, the "Carousel Club" and the "Vegas Club", that he and his sister Eva Ruby Grant had set up -- he spent most of his time running the Carousel Club, while Eva ran the Vegas Club.
There were tales that he had been working for the Mob in Chicago and was sent to Texas to help run the Mob's operation there, but later the FBI checked up on his background in both locales and found no corroborated witnesses to confirm a connection. Extensive wiretaps of Mob people never mentioned Ruby; gangland informants had never heard of him. [TO BE CONTINUED]
* GIMMICKS & GADGETS: A firm named BMT Nigel Gee, out of Southampton in the UK, is now working on a 150-passenger electric-powered catamaran ferry for China. The vessel will be 26 meters long and 8.5 meters wide (85 x 28 feet); it will obtain drive from a vanadium redox flow battery, a type of robust rechargeable battery with some resemblances to a fuel cell system, capable of being recharged instantly when necessary just by swapping out electrolyte. However, the ferry will also have a topdeck covered with a solar array to allow the battery to be recharged by sun power. Maiden voyage is set for 2012.
* Sand beds are often used as filters in water purification systems, but coarse sand doesn't filter well and water doesn't soak through fine sand very quickly. A multinational team of researchers has developed a solution that does both of them better, by filtering the water with coarse sand coated with graphite. The material, called "super sand", is easy and cheap to make, as well as highly effective, almost as good as activated carbon. In addition, the production process could be tweaked to optimize the super sand for different sorts of pollutants. It is seen as potentially very useful for developing nations, where the supply of clean water is very uncertain.
* As reported by THE ECONOMIST, while we tend to think of oil and solar power as polar opposites in terms of energy sources, that may not necessarily be so. Getting oil out of the ground is not a trivial task, and it gets trickier when the oil is thick and sludgy. Low-grade oil can be obtained by pumping steam into the deposit; such "enhanced-recovery" techniques date back to the 1950s, and 40% of California's oil production now depends on this trick. The problem is that producing the steam requires a lot of energy, typically by burning natural gas -- which makes it expensive. Those who worry about carbon footprints also point out that the amount of gas used to obtain a barrel of Californian heavy oil results in emissions comparable to those from a barrel of oil obtained from Canada's notoriously dirty tar sands.
A California startup named Glasspoint wants to use sunshine to produce the steam. It turns out that the steam used for oil recovery doesn't have to be very hot and so solar energy can do the job. There is the problem that oilfields are typically grungy places, producing lots of dust unfriendly to solar power systems; Glasspoint plans to deal with this problem by putting the mirrors for producing the steam in greenhouses. Greenhouses are a well-established technology, being cheap to buy and set up, and they are also easy to clean and maintain. Putting the mirrors in greenhouses also keeps them out of the wind, so the mirror systems don't have to be as robust or expensive as those that would be exposed to the elements.
Some gas may still need to be burned to produce the steam, but the solar system should considerably reduce the amount of gas required -- and with a booming market for natural gas, that also means more gas to sell. Glasspoint officials believe that their solar approach will be only two-thirds as expensive as using natural gas to produce steam. A pilot project has been underway in California and has gone well, with the company now working towards production deals. Glasspoint officials believe that the sunny Mideast will be a prime target for solar oil-extraction technology even in the near term -- with the ironic result that within a decade the majority of solar power capacity will be used to extract oil.
* USPS ON THE ROPES: As reported by an article in BUSINESS WEEK ("The End Of Mail" by Devin Leonard, 30 May 2011), the US Postal Service (USPS) is as old as the United States -- but now it is close to financial collapse.
It's been there before. Early in its history, recipients actually had to pay to receive mail, but in 1863 the Post Office, as it was called, started the radical notion of free delivery to urban citizens, expanding the same free service to rural citizens in 1896 to establish "universal service". That led to widespread public enthusiasm for the mails and the rise of the influence of the Post Office -- which unfortunately made it a useful source of political patronage. Eventually, the load on the system outstripped weak management, with the Chicago mail system visibly breaking down in 1966 as mountains of undelivered mail piled up in the post offices.
In 1970, President Richard Nixon signed the Postal Reorganization Act, establishing the modern USPS, organized in principle as a government corporation that operates like a private business. It didn't, actually, with the USPS providing a cozy deal for its unionized workers that hardly suggested a lean organization. But why not? The USPS didn't have real competition then and its services were in demand, being the normal way of sending documents.
In the late 1970s, however, the laws were adjusted to allow FedEx and UPS to intrude on USPS turf by handling express mail. They set their own rates and played tough with their unions; as one analyst put it: "They just cleaned the Postal Service's clock." Then in the 1990s email started taking hold. Who would bother to send a letter when they could send an email instead? By 2000 the USPS was clearly in decline. In 2005 the volume of first-class mail fell below the volume of junk mail -- and junk mail was only a third as profitable to the USPS as first-class mail. By 2006, the US Congress was adjusting the USPS's pension liabilities in hopes of staving off disaster. When the economy went south, the USPS imploded.
The USPS now has over 570,000 employees, making it the second-largest civilian employer in the USA after giant retailer Wal-Mart. The USPS has almost 32,000 post offices. It's losing money steadily and can't match the competition. UPS has 53% of the express & parcel shipment market, while FedEx has 32%; the USPS straggles along with 15%. With first-class mail fading out, the USPS has in effect become a delivery system for junk mail, a fact that hardly enhances its public image. Says one analyst: "Pretty soon [the USPS] is going to be a government-run advertising mail delivery service. Does that make any sense? It doesn't make any sense."
* The interesting thing is that much the same pressures have been affecting postal services in other countries, and many of them are thriving. In 2008, the US Congress directed the General Accounting Office (GAO) to figure out what was wrong with the USPS and what should be done about it. The task fell to a team under a GAO official named Phil Herr. The problems with the USPS weren't so hard to figure out; to figure out solutions, he sent a small team of analysts to Finland, Sweden, Germany, Switzerland, Austria, and Canada to observe how the postal services worked there. The results were fascinating.
Into the 1980s, postal services elsewhere were organized along lines similar to those of the USPS, but in the late 1980s the European Union set out to create a more unified postal system, with the various member states giving up their postal monopolies to permit competition. It worked spectacularly, with many countries shutting down most of their post offices, moving postal services into gas stations and convenience stores. Sweden's Posten only operates 12% of the country's post offices; Germany's Deutsche Post, now a private company, only runs 2% of the offices. Deutsche Post is doing so well that it was able to buy out DHL, a parcel shipper that competes with UPS and FedEx. Deutsche Post says that half the firm's employees are outside Germany, as are company profits.
Instead of cowering from the internet revolution, these postal services embraced it. Itella, Finland's postal service, keeps a digital archive of users' mail for seven years and offers secure online bill payment services. Swiss Post allows customers to choose whether they want mail delivered hard copy, or scanned and sent as email; customers also have the right to ask that junk mail not be delivered. Sweden's Posten has a cellphone app that allows users to convert images from their cellphones to postcards, and is working on a digital code system to replace stamps. Posten became part of NordPost through a 2009 merger with Denmark's Post Danmark, and business is booming.
* Compared to such ferment, the USPS appears painfully inert. Herr has expressed frustration with the Postal Service, saying that when he tried to get USPS officials to tell him about their ten-year plan, the answer was: "We don't have one." However, Postmaster General Patrick Donahoe sounds many of the right notes, talking of cutting Saturday mail delivery, reducing headcount through attrition, and shutting down 2,000 post offices to move services to convenience stores and supermarkets. Many believe that while those are steps in the right direction, they are too little and too late. The USPS is now reaching its mandatory debt limit. As another observer puts it, the Postal Service "is either going to default on [its] obligations to its retirees, or we are going to have to give it a direct bailout from the United States taxpayers."
* SHARE THE CARS: Urban schemes for sharing bicycles were discussed here earlier this year. Now, as discussed by an article from BBC WORLD Online ("Paris Launches Electric Car-Sharing Scheme", 30 September 2011), Paris is launching a new car-sharing project organized along the same lines in hopes of reducing traffic congestion and promoting electric vehicles (EVs).
The "Autolib" system is conceptually based on the successful "Velib" bicycle-rental service. Under a two-month pilot project, motorists will be able to rent battery-powered "Bluecars" for a half-hour at a cost of 4 to 8 euros (about $5.50 to $11 USD). Membership in the Autolib scheme will cost from 10 euros (about $13.50 USD) a day, with a year subscription running at 144 euros (just under $200 USD).
The four-seat Bluecars are provided by entrepreneur Vincent Bollore and built by the Italian design house Pininfarina, famous for styling Ferraris and Maseratis. The EVs will have a respectable range of up to 250 kilometers (155 miles) before needing a recharge, which will take about four hours. At the outset, 66 EVs will be available for rent at 33 charging stations. However, Paris Mayor Bertrand Delanoe says the system will expand to 3,000 EVs and more than 1,000 charging stations by the end of 2012.
Autolib general manager Morald Chibout explained: "We want to persuade people to shift from the concept of owning a car to that of using a car." For people living in big cities that isn't such a leap of faith, since traffic density and lack of parking can make car ownership impractical. Traditional car rental is too troublesome for daily use, but Autolib is not competing with rentacar companies; the scheme is designed to make it easy to get hold of a car, with a pricing structure designed to encourage short trips.
If Autolib is successful, it should be a boost to EV manufacturers. Since Autolib is only intended to support urban traffic, the short range of EVs is not so much of an issue. The pricing is very attractive, since at $200 USD a year that's less than most people pay for insurance on their private automobiles. Of course, there are likely to be insurance considerations associated with the use of the Autolib network as well, but that detail wasn't discussed in the article.
* Ultimately, schemes like Autolib could lead to fully automated traffic systems, where users can request a car using a cellphone and the car driving itself to the pickup point. Once the user is done with the car, it then drives itself off to pick up another user. In an urban network, the cars would not need to provide all the smarts for automated driving themselves, instead collaborating with the smarts of the network. Each car would be able to handle itself in traffic, but the network would determine its routing, while using the distributed network of cameras and other sensors in the cars along with infrastructure sensors to determine exception conditions such as accidents or road damage.
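The dispatch step of such a network reduces to an assignment problem; a minimal greedy version -- hypothetical data structures, nothing from the article -- might look like this:

```python
def manhattan(a, b):
    """City-block distance between two (x, y) grid positions."""
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

def dispatch(cars, pickup):
    """Send the nearest idle car to a pickup point; return None if all busy."""
    idle = [c for c in cars if c["idle"]]
    if not idle:
        return None
    car = min(idle, key=lambda c: manhattan(c["pos"], pickup))
    car["idle"] = False                 # the car drives itself to the user
    return car

fleet = [{"id": 1, "pos": (0, 0), "idle": True},
         {"id": 2, "pos": (5, 5), "idle": True},
         {"id": 3, "pos": (2, 1), "idle": False}]
print(dispatch(fleet, (4, 4))["id"])    # -> 2, the nearest idle car
```

A real network would of course weigh battery charge, predicted demand, and rebalancing of cars across stations, not just raw distance -- which is exactly where the network-level "smarts" described above would come in.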
* 3-CYLINDER ENGINES: While electric and hybrid cars are big automotive news these days, two car manufacturers are also making a bit of a splash with little cars running piston engines -- the wrinkle being they've lost a cylinder: "THREE is the new FOUR!"
Ford recently unveiled its 1-liter "EcoBoost" engine at the Frankfurt Auto Show, capable of 88 kW (118 HP), the company saying it will be offered in Europe later in 2012 in the Focus and other small vehicles -- though it won't be sold in the USA just yet. Volkswagen will offer a three-cylinder engine in the "Up!" compact, expected by the end of 2012. VW showed it off at Frankfurt in the form of the "eco Up!" concept car, powered by a three-cylinder natural gas engine.
Says an industry analyst: "This downsizing is happening globally. Sixes are replacing eights, fours are replacing sixes and now threes are replacing fours." Traditionally, small engines meant puny performance -- but not any more, thanks to turbocharging, direct injection, variable-cam timing, start-stop operation, and other tricks. Ford claims its little three-cylinder will perform like a 1.6-liter four.
A three-cylinder four-stroke engine is tricky, because there's one less cylinder than strokes, meaning the engine is inclined to run in an unbalanced and destructively rough fashion. Ford engineers got around this problem by linking a lopsided flywheel to the crankshaft, the flywheel's off-balance behavior neatly canceling out the off-balance behavior of a drive system with a "missing" piston. The Ford 3-cylinder engine is said to run very smoothly.
The push downward is of course being driven by mandated fuel-economy standards, and the fact that consumers are now more interested in fuel economy. They still want the bang, but if they can get it for less bucks in fuel costs, all for the good. Smaller engines allow the car to be smaller and lighter, enhancing fuel efficiency further, with efficiency further enhanced by the use of seven-, eight- and even nine-speed automatic transmissions. Three cylinders isn't the bottom limit, either; Fiat is offering the Fiat 500 in Europe with a tiny 875-cc two-cylinder engine with remarkable fuel economy.
* SMART STRUCTURES REVISITED (1): The notion of "smart infrastructure" -- structures wired with sensors to monitor their condition -- has been discussed here, last in 2009. THE ECONOMIST ran a survey on the topic ("Superstructures", 11 December 2010) providing new details.
The collapse of the bridge on Interstate 35W over the Mississippi River near Minneapolis on 1 August 2007 killed 13 people and injured over a hundred. It became a modern textbook study of structural design gone wrong. The bridge had opened in 1967 and wasn't scheduled for replacement until 2020; it had been judged "structurally deficient" in 2005 after an inspection, but other bridges were in worse shape and the belief was that repairs could wait. In 2008, the US National Transportation Safety Board released a report on the accident, judging that the failure was actually triggered by extra concrete that had been added to reinforce the bridge so it could handle higher traffic loads.
The problem with infrastructure all over the world is that it tends to be built for the long haul, which means that much of it in service is old and, worse, is being subjected to heftier loads than it was originally designed for. For example, British trains routinely operate on arched bridges built in Victorian times; to be sure, the old structures are maintained, but traditional inspections aren't always enough to spot dangerous problems ahead of time. In an era of smart, cheap wireless sensors, can't we do better? A bridge should be able to keep track of its own condition at all times, logging data to a server that will raise an alarm if something's wrong. Ultimately, a bridge could even repair itself, to a degree, without intervention. Such "smart structures" will be comprehensively wired with sensors to measure conditions such as temperature, vibration, and strain. The technology is not at all new, but traditionally it was limited by the need to wire the structure up at considerable cost to provide power and signal paths for the sensors. Now we have wireless and, sometimes, self-powered sensors that can simply be placed where they're needed.
* In a two-month trial performed in the summer of 2006 on the Golden Gate Bridge in San Francisco, a team led by Sukun Kim and Shamim Pakzad of the University of California, Berkeley, installed a network of 64 vibration-monitoring wireless sensors on one of the bridge's towers and across its main span, at a cost of $600 USD per sensor, compared with $4,000-$15,000 per sensor for hard-wired devices. The sensors in the Berkeley experiment relayed data from one to the other, meaning the sensors didn't need high-powered transmitters to communicate. It could take dozens of "hops" for the data from a sensor to get to the central server monitoring the network, but the data packages were small and the data rates low, so the overhead wasn't a problem.
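To put the per-sensor figures above in perspective, the totals for a 64-node network work out as follows -- the per-sensor prices come from the article, but the totals are my own arithmetic:

```python
# Cost comparison for a 64-node bridge monitoring network, using the
# per-sensor figures quoted for the Golden Gate trial.
NODES = 64
WIRELESS_COST = 600        # USD per wireless sensor node
WIRED_COST_LOW = 4_000     # USD per hard-wired sensor, low end
WIRED_COST_HIGH = 15_000   # USD per hard-wired sensor, high end

wireless_total = NODES * WIRELESS_COST
wired_range = (NODES * WIRED_COST_LOW, NODES * WIRED_COST_HIGH)

print(f"Wireless: ${wireless_total:,}")                      # $38,400
print(f"Wired:    ${wired_range[0]:,} to ${wired_range[1]:,}")  # $256,000 to $960,000
```

A hard-wired installation would also carry cabling and labor costs beyond the per-sensor price, so the gap in practice would be wider still.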
This demonstration study showed the idea was feasible, but it left open the question of whether wireless sensor networks would be practical under harsh field conditions. Another team of researchers, led by Kenichi Soga of the University of Cambridge, followed up the Berkeley study by testing wireless sensors for nearly three years at three locations in Britain.
As often happens when applying new technology to the real world, the Cambridge researchers ran into a series of problems. On the Humber Bridge, when the sensors were initially placed on the structure, radio reflections caused mutual interference, meaning that a single node could take up to an hour to get onto the sensor network. They constructed software to map out the best locations for sensor placement on the bridge.
There were less subtle problems as well. Many of the sensors installed in the Underground tunnels fell off the concrete walls within a day or so, and better glues had to be found to make them stick. That solved, within a few weeks the sensors were found to be covered in thick layers of brake dust, requiring the fitting of protective casings. Some of the sensors also proved unreliable over the longer term. Such difficulties are no surprise in a developmental project, and when the sensor networks were working, they did all that was expected of them. The two bridges turned out to be sound, but the measurements of the Underground tunnel suggested its lining would need replacement in the near future. [TO BE CONTINUED]
* THE KILLING OF JFK -- THE BALANCE OF EVIDENCE (40): It must of course be emphasized that though the evidence damns Oswald for killing JFK and Officer Tippit, that in itself doesn't rule out a conspiracy. Maybe there were others involved; maybe Oswald, even though he really was the triggerman, was thrown to the wolves to protect the conspiracy. However, there is no credible evidence for other gunmen, and Oswald's links with supposed conspirators are unpersuasive -- the FBI's conversations with him, for example, suggest nothing more than the bureau keeping tabs on a citizen who didn't do much to conceal his disloyalty. The links to other groups fingered as players in the assassination are even weaker.
Oswald died with less than $200 USD to his name, and for his getaway he took a public bus; if he was part of a conspiracy, he was being left to his own threadbare resources to carry out his duties. Similarly, it is hard to think of a convincing explanation of why Oswald, having been betrayed by the conspiracy, would not have immediately betrayed the conspiracy in turn to save his own skin. Nothing in Oswald's background suggests he would have voluntarily sacrificed himself for anyone. Besides, as noted, he resisted arrest, even pulling his pistol on the cops. Most significantly, Oswald was an antisocial paranoid who had no marketable skills, was barely able to hold down menial jobs, and was generally perceived by those who met him as unpleasant, unstable, and unreliable. Nobody would have regarded such a loser as a reliable agent, and on the other side of the coin, he was too insubordinate to take orders from anyone any more than he felt he had to.
The underlying question about Oswald is motive. Marina told the Warren Commission that her husband had tended to admire JFK; and Oswald himself told the Dallas police under interrogation that he had no motive to kill JFK, since there was no reason to think LBJ would pursue any different policies. What Oswald's real motives were can't be proven and will forever remain a matter of speculation.
However, it seems easy to believe that a frustrated nobody like Oswald, inclined to violence, saw the killing of JFK as empowerment. Oswald was an ideologue, a half-baked believer in revolutionary utopianism, and JFK's visit to Dallas gave him an opportunity to strike a blow that would go down in history. Again, that's just speculation, but on inspection of the photos of Oswald swaggering in front of the camera with his rifle and his pistol, it becomes very plausible speculation: it was a gesture of someone with a very active and unhealthy fantasy life.
The idea that there necessarily had to be a conspiracy behind the assassination doesn't hold water. Presidential assassins, as well as those who have tried to kill presidents, have traditionally been "lone nuts". Although conspiracy theorists like to retroactively imagine conspiracies for assassinations before that of JFK, the only presidential assassin who was known to be part of a conspiracy was John Wilkes Booth, and Booth hardly makes a good model for those who claim Oswald was part of a conspiracy. Booth's conspiracy to kill Abraham Lincoln and other top US government officials was clumsily organized and quickly tracked down. There was no evidence that any government organization, Union or Confederate, knew about Booth's plot beforehand. It should be noted, incidentally, that though some conspiracy theorists point out that Oswald had no record of serious criminal acts up to the assassination, neither did Booth.
In any case, if Oswald was part of a conspiracy, in half a century nobody has found any credible evidence as to its nature, and those who still claim there was a conspiracy disagree widely among themselves. The idea that JFK could have been killed in such a meaningless accident of fate does remain distressing -- but unfortunately it is nothing unusual for people to be killed in meaningless accidents of fate. [TO BE CONTINUED]
* Space launches for October included:
-- 02 OCT 11 / COSMOS 2474 (GLONASS M) -- A Soyuz 2-1b booster was launched from Baikonur in Kazakhstan to put a Russian GLONASS M navigation satellite into orbit. The satellite was designated "Cosmos 2474". This launch brought the GLONASS constellation up to its full 24 satellites. This was the first launch of a Soyuz booster since a launch failure in August.
-- 05 OCT 11 / INTELSAT 18 -- A Land Launch Zenit 3SLB booster was launched from Baikonur to put the "Intelsat 18" geostationary comsat into orbit. The spacecraft was built by Orbital Sciences and was based on the Orbital GEOStar 2 spacecraft platform. Intelsat 18 had a launch mass of 3,200 kilograms (7,055 pounds), a payload of 24 C-band / 12 Ku-band transponders, and a design life of 15 years. It was placed in the geostationary slot at 180 degrees East longitude to provide communications services to the Asia-Pacific region.
-- 07 OCT 11 / W3C -- A Long March 3B booster was launched from Xichang in China to put the Eutelsat "W3C" geostationary comsat into orbit. The spacecraft was built by Thales Alenia Space and was based on the firm's Spacebus 4000 C3 satellite bus. It had a launch mass of 5,440 kilograms (12,000 pounds); a payload of 53 Ku / 3 Ka band transponders, targeted on four regions; and a design life of 15 years. It was placed in the geostationary slot at 16 degrees East longitude to provide communications services to Europe, Africa, and South Asia.
-- 12 OCT 11 / MEGHA-TROPIQUES, SMALLSATS x 3 -- An ISRO Polar Satellite Launch Vehicle was launched from the Indian launch center at Sriharikota to put the Indo-French "Megha Tropiques" Earth environmental satellite into orbit. The spacecraft was designed to investigate the water cycle and tropical climate. It had a launch mass of a tonne (2,200 pounds) and the mission was expected to last three years. "Megha" means "Cloud" in Sanskrit; India built the spacecraft bus, while France provided a water profiling instrument and a radiative emissions mapper. The launch also included three small satellites as secondary payloads.
-- 19 OCT 11 / VIASAT 1 -- A Proton M Breeze M booster was launched from Baikonur to put the "ViaSat 1" geostationary comsat into orbit for ViaSat INC of Carlsbad, California. ViaSat 1 was built by Space Systems / Loral and was based on the SS/L 1300 comsat platform. The satellite had a launch mass of 6,760 kilograms (14,900 pounds); a Ka-band payload with a bandwidth of 15 gigabits per second and 72 spot beams, 63 to cover the US and 9 for Canada; and a design life of 15 years. ViaSat 1 was placed in the geostationary slot at 115 degrees West longitude to provide direct broadband internet access to users in North America.
-- 21 OCT 11 / GIOVE A,B -- A Soyuz 2-1b (Fregat) booster was launched from Kourou in French Guiana to put two "Galileo In-Orbit Validation Experiment (GIOVE)" navigation satellites into orbit. Each spacecraft had a launch mass of 700 kilograms (1,545 pounds); they were built by Thales Alenia Space in Italy. This was the first Soyuz launch from the New World.
-- 28 OCT 11 / NPP -- A Delta 2 7920-10 booster was launched from Vandenberg AFB to put the "NPOESS Preparatory Project (NPP)" polar-orbiting weathersat into orbit for NOAA & NASA. It had a launch mass of 2.27 tonnes (2.5 tons) and included a payload of five instruments: the VIIRS visible-infrared imager, the CrIS infrared sounder, the ATMS microwave sounder, the OMPS ozone mapper / profiler, and the CERES Earth radiation budget instrument.
CERES was the only instrument that was ported more or less directly from earlier spacecraft. The other four instruments were improved new-design versions of earlier instruments. NPP was primarily intended to validate technologies for the future Joint Polar Satellite System (JPSS). NPP's mission lifetime was five years, though the spacecraft had fuel for seven years.
-- 30 OCT 11 / PROGRESS 45P (ISS) -- A Soyuz U booster was launched from Baikonur to put a Progress tanker-freighter spacecraft into orbit on an International Space Station (ISS) supply mission. It was the 45th Progress mission to the ISS. It docked with the ISS Pirs module on 2 November.
-- 31 OCT 11 / SHENZHOU 8 -- A Long March 2F booster was launched from Jiuquan in China to put the "Shenzhou 8" space capsule into orbit. It did not carry a crew, though it did fly recoverable payloads. It performed an automated docking with the Tiangong 1 space station, launched in September 2011. Shenzhou 8 spent three weeks in space, docking with the space station twice.
* OTHER SPACE NEWS: Elon Musk, boss of the SpaceX commercial space booster firm, is now proposing a fully reusable launch vehicle (RLV). As currently envisioned, the RLV looks very much like the SpaceX Falcon 9 expendable booster. However, after launch and second stage separation, the first stage flips around and performs an engine burn for a controlled descent, to land vertically on four pop-out legs. After release of the payload, the second stage uses a heatshield on its forward section to re-enter the Earth's atmosphere, to then use its engines for a vertical landing, just like the first stage.
The history of RLVs is a dismal one, and Musk knows it: "It's just a very tough engineering problem." However: "I've come to the conclusion that it can be solved. And SpaceX is going to try to do it ... If you look at the cost of a Falcon 9, it's about $50 to $60 million. But the cost of the fuel and oxygen and so forth is only about $200,000. So obviously, if we can reuse the rocket, say, a thousand times, then that would make the capital cost of the rocket for launch only about $50,000 ... It would allow about a hundred-fold reduction in launch costs."
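Musk's arithmetic can be checked directly with the figures he quotes. Note that the naive calculation below actually gives a bigger reduction than the "hundred-fold" he claims; presumably his figure leaves margin for refurbishment and other overhead the sketch ignores:

```python
# Amortizing booster cost over many reuses, using Musk's quoted figures.
ROCKET_COST = 50e6   # USD, low end of the quoted Falcon 9 cost
FUEL_COST = 200e3    # USD, propellant and consumables per launch
REUSES = 1000        # flights per vehicle in Musk's hypothetical

capital_per_flight = ROCKET_COST / REUSES          # capital cost per launch
cost_per_flight = capital_per_flight + FUEL_COST   # capital + propellant
reduction = ROCKET_COST / cost_per_flight          # vs. expending the rocket

print(f"Capital per flight: ${capital_per_flight:,.0f}")  # $50,000
print(f"Total per flight:   ${cost_per_flight:,.0f}")     # $250,000
print(f"Reduction:          {reduction:.0f}x")            # 200x
```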
It would be interesting to know what penalty would be imposed by adding landing gear and other kit for RLV operation, as well as reserving fuel for controlled landings: 20% would be tolerable, 10% would be attractive. 10% doesn't sound unreasonable: 5% weight gain for the re-entry systems, 5% for the additional fuel -- after all, the stages would be mostly empty when they came in for touchdown.
In related news, as a corrective to the reports of enthusiasm for small satellites run here in the past, the market is so soft that SpaceX is putting its Falcon 1 light booster on the back burner for now. The company's not giving up on it just yet, merely recognizing that the current business model for the Falcon 1 needs to be re-evaluated.
* The "Mars500" exercise, a mission simulation of a Mars expedition performed in Moscow and mentioned here in 2010, finally ended, with the six-man crew emerging after 520 days of isolation. Researchers observing the crew believe Mars500 should provide very useful data about crew mental health and interaction for an actual Mars mission -- if one is ever flown.
* FLEXIBLE AC TRANSMISSION SYSTEMS: The concept of the "smart grid" has been discussed here in the past. Discussions of smart grids often focus on "smart meters" with wireless connections, but smart grids also involve enhancement of the transmission network to ship power from generation sites to end users. As discussed by an article from IEEE SPECTRUM ("The FACTS Machine" by Peter Fairley, January 2011), much of the potential improvement to the power grid is being provided by a set of technologies known as "Flexible AC Transmission Systems (FACTS)". A power grid enabled by FACTS could reconfigure power flows in real time, making sure power gets to those who need it with minimal losses, as well as accommodate inherently intermittent renewable power sources such as solar and wind.
FACTS is already here to an extent, having been validated in the 1990s through demonstrations led by the power industry's Electric Power Research Institute (EPRI) and grid equipment manufacturers, such as General Electric (GE) and Zurich-based ABB. Over the past decade, FACTS has gone commercial and, as an ABB official puts it, is "penetrating the network everywhere."
* Basic elements of FACTS actually emerged in the 1970s. AC electrical circuits handle two components of electricity, which can be labeled "active" and "reactive" power. Active power is the familiar watts consumed by lightbulbs, toasters, personal computers, and other effectively "resistive" loads; it's the power that actually does work, defined by the product of voltage and the component of an alternating current in phase with the voltage.
Thanks to the presence of "reactive" circuit elements -- inductors and capacitors -- some of the current will not be in phase with the voltage. This out-of-phase component multiplied by the voltage gives the reactive power, which is measured in "volt-amperes reactive (VAR)", or more commonly, megavars. Inductors cause current to lag voltage, resulting in "negative reactive power", while capacitors cause current to lead voltage, resulting in "positive reactive power". Reactive power cannot be used to do work -- it's power stored by the reactive elements and then released back into the circuit -- and amounts to a nuisance, since the system still has to provide the additional power in the first place. Negative reactive power is the real problem, since unwanted inductive effects tend to build up more strongly in a power grid than capacitive effects, and if the buildup gets too great it can cause voltage levels to "sag", possibly damaging loads and even leading to a "domino effect" that can take down a power grid.
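In formula terms, with V and I as RMS voltage and current and phi as the phase angle between them, active power is P = V * I * cos(phi) and reactive power is Q = V * I * sin(phi). A quick sketch with illustrative numbers (my own, not from the article):

```python
import math

V_RMS = 230.0             # volts
I_RMS = 10.0              # amps
phi = math.radians(30.0)  # current lags voltage by 30 degrees (inductive load)

apparent = V_RMS * I_RMS             # apparent power, volt-amperes
active = apparent * math.cos(phi)    # watts -- the part that does real work
reactive = apparent * math.sin(phi)  # VAR -- stored and returned, does no work

print(f"Apparent: {apparent:.0f} VA")   # 2300 VA
print(f"Active:   {active:.0f} W")      # 1992 W
print(f"Reactive: {reactive:.0f} VAR")  # 1150 VAR
```

Driving phi toward zero -- which is what reactive power compensation does -- turns more of those volt-amperes into useful watts.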
Power utilities compensate for negative reactive power caused by inductances by adding positive reactive power into the system. Traditionally there are two ways of doing this: by switching banks of capacitors into a circuit to convert some of its megawatts into megavars, or by tuning the generators in conventional power stations to produce current waveforms that lead voltage.
FACTS got started as a smarter, more dynamic solution, and it has become increasingly important as deregulation has turned the electricity business into a sort of electrical anarchy, where supply and demand have a continuously shifting relationship. FACTS is smart enough to manage that relationship, allowing more power to be sent over lines than they would otherwise be judged able to support safely, with the increment ranging up to 50%.
The core of a modern FACTS controller system is an array of solid-state switches, often coupled to capacitors. Typically, the solid-state switches open to tap power from the line and charge a capacitor; the switches then fire in sequence to create a synthetic AC waveform with precisely the needed phase difference between current and voltage, with that waveform driven into the grid. By precisely varying the phase difference, the FACTS controller can add or subtract reactive power in fine increments.
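The synthesis step can be sketched as building a sampled sine wave at a chosen phase offset. This is a toy model of the idea, not an actual controller algorithm; the step count and function names are my own:

```python
import math

STEPS = 64  # synthesis steps per AC cycle (illustrative granularity)

def synth_cycle(amplitude, phase_deg, steps=STEPS):
    """One cycle of a synthetic AC waveform, phase-shifted relative to the
    grid voltage reference -- a toy model of the waveform a FACTS switch
    array builds up by firing its switches in sequence."""
    phase = math.radians(phase_deg)
    return [amplitude * math.sin(2.0 * math.pi * n / steps + phase)
            for n in range(steps)]

# A current waveform leading voltage by 90 degrees injects purely
# positive reactive power into the grid; smaller offsets add or
# subtract reactive power in finer increments.
leading = synth_cycle(amplitude=1.0, phase_deg=90.0)
```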
Such "dynamic voltage regulation" is slick, but FACTS can do more. In the 1990s, FACTS developers made use of new high-power semiconductor switches, such as "insulated-gate bipolar transistors (IGBT)", that could switch at frequencies higher than the standard 50 Hz / 60 Hz AC rate. Using such fast switches, FACTS controllers could simultaneously regulate voltage and trim glitches out of the AC signal. One such FACTS device, the "static synchronous compensator (STATCOM)", has played an important role in the more than tenfold rise in wind power capacity worldwide over the past decade. A Siemens-built STATCOM, for example, is stabilizing the power output of the world's largest offshore wind farm, completed in September 2010, whose 100 three-megawatt wind turbines feed enough energy to the United Kingdom's grid during the year to supply more than 200,000 homes. Even just a few years ago, power transmission engineers were intimidated by the gusty, noisy power output of big wind farms, but now such farms are seen as ordinary big power projects.
FACTS also makes it easier to ship AC electricity over long distances. In 1998, Brazil commissioned a pair of 1,000 kilometer (620 mile), 500 kilovolt lines to link its northern grid, charged with Amazonian hydropower, to the southern grid serving its coastal population centers. A FACTS controller near the northern end of the line simultaneously drives power and damps destabilizing feedback signals. Nobody had ever put up AC trunk lines that went that far before; high-voltage direct current (HVDC) lines are actually more efficient for long-haul transmission, but the Brazilians needed to tap the trunk lines for communities along the way, which would have been difficult with HVDC. Two other developing giants, China and India, are also interested in FACTS as a way of reducing the cost and complexity of electrical grid development.
* FACTS is still a work in progress. In 2003, EPRI and the New York Power Authority demonstrated how two FACTS controllers could draw electricity off one line and drive it down another, guiding power around roadblocks in the grid. However, using FACTS to improve power transfers between meshed grids will demand greater efficiency, as well as more recognition of the value of FACTS. Grid experts believe both are going to happen.
Efficiency will improve as FACTS manufacturers adopt silicon carbide semiconductor switches -- discussed here recently. Currently, high-frequency silicon IGBT devices used in sophisticated FACTS controllers are no more than 94% efficient, meaning not only that they can waste megawatts of power, but they also need active cooling to keep from burning themselves up, being unable to handle temperatures greater than 110 degrees Celsius (230 degrees Fahrenheit). Silicon carbide IGBT devices are 96% efficient or better and can operate at 200 degrees Celsius (390 degrees Fahrenheit), reducing wastage and overhead. Silicon carbide devices should also make HVDC conversion systems more efficient.
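The efficiency difference sounds small but is significant in absolute terms. For an illustrative 100-megawatt installation (my figure, not the article's), the arithmetic works out as:

```python
# Conversion losses at the article's quoted efficiencies, for an
# assumed 100 MW of power handled by the controller.
THROUGHPUT_MW = 100.0

loss_silicon = THROUGHPUT_MW * (1.0 - 0.94)  # silicon IGBT, ~6 MW lost as heat
loss_sic = THROUGHPUT_MW * (1.0 - 0.96)      # silicon carbide, ~4 MW lost

saving = loss_silicon - loss_sic             # ~2 MW recovered
print(f"Silicon loss: {loss_silicon:.1f} MW, "
      f"SiC loss: {loss_sic:.1f} MW, saving: {saving:.1f} MW")
```

Every megawatt not dissipated as heat is both salable power and cooling hardware that doesn't have to be built.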
Interest in FACTS continues to grow. Building the smart grid is still not going to be easy, not merely because of the difficulty of getting approval for new power lines, but because the future power grid is necessarily going to be more complicated than it is now. FACTS promises to be an important element in getting it to work.
* DOES CANCER SCREENING WORK? While screening for cancer has become a tradition in medical checkups, as reported by an article from THE NEW YORK TIMES ("Considering When It Might Be Best Not To Know About Cancer" by Gina Kolata, 29 October 2011), there's a growing realization in the medical community that the payoff for cancer screening isn't as big as had been assumed, and in fact it can be counterproductive.
Expert groups are now proposing less screening for breast, cervical, and prostate cancer. Two years ago, the influential United States Preventive Services Task Force (PSTF), which evaluates evidence and publishes screening guidelines, said that women in their 40s do not appear to benefit from mammograms, and that women ages 50 to 74 should consider having them every two years instead of every year. This year, the PSTF said the widely used PSA screening test for prostate cancer does not save lives and actually does more harm than good; the group also concluded that most women should have Pap tests for cervical cancer every three years instead of every year.
The change in mindset was due to new clinical trials and analysis of medical data. Two recent trials of prostate cancer screening threw doubt on whether many lives, if any, are saved. Such screening has often led to expensive and troublesome courses of treatment for men who really weren't at much risk. Similarly, a new analysis of mammography concluded that while mammograms find cancer in 138,000 women each year, as many as 120,000 to 134,000 of those women either have cancers that are already terminal, or have cancers that grow so slowly they do not need to be treated.
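The mammography figures imply a striking overdiagnosis fraction. Using the article's numbers, the share of detected cases where screening made no difference works out as:

```python
# Overdiagnosis fraction implied by the mammography figures in the article.
DETECTED = 138_000         # women per year with cancers found by mammograms
NOT_HELPED_LOW = 120_000   # terminal or slow-growing cancers, low estimate
NOT_HELPED_HIGH = 134_000  # high estimate

frac_low = NOT_HELPED_LOW / DETECTED
frac_high = NOT_HELPED_HIGH / DETECTED
print(f"Screening did not change the outcome for roughly "
      f"{frac_low:.0%} to {frac_high:.0%} of detected cases.")
```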
Over the past ten years, cancer experts have become aware of a growing body of evidence to show that while early detection through widespread screening can help in some cases, those cases are small in number for most cancers, and don't compare with the problems due to misdiagnosis. Unfortunately, screening has momentum behind it. Specialists such as urologists, radiologists, and oncologists, who see patients dying from cancer, often don't like the idea of doing less screening. Giving up, or at least scaling back, on screening is also tacit admission of the truth that if patients don't notice something wrong with themselves, doctors aren't in a vastly better position to find something wrong either, and doctors can be reluctant to throw up their hands and admit defeat. General practitioners, who may agree with the new guidelines, worry about having to explain to patients why they shouldn't have a mammogram every year, or a PSA test at all.
And then there's an ugly legal trap: patients who are not screened and then develop cancer might press lawsuits on doctors, even when the odds that the screening might have helped them are too low to bet on. Patients also tend to suspect that the de-emphasis on screening is really all about cost-cutting and not about delivering proper medical treatment. However, despite these hazards, medical researchers are now becoming outspoken about the need to kick the screening habit. According to Otis Brawley, chief medical officer of the American Cancer Society (ACS): "No longer is it just: 'Can you find the cancer?' Now it is: 'Can you find the cancer, and does finding the cancer lead to a decrease in the mortality rate?'"
Besides, cost is a concern for patients as well as the medical community. Even under the best of circumstances, there's only so much money to go around, and money being spent on ineffective medical screening is money not being spent on medical treatments that work. Since the medical profession is ultimately, one way or another, funded by patients, that gives patients an incentive to get the best treatment for their money.
Still, the changing message has created confusion. People have, for good reason, a terror of cancers, but it is becoming clear that we only notice cancers when they cause trouble -- and they don't always cause trouble. Researchers are increasingly realizing that many cancers grow very slowly, or just stop growing; some even regress on their own without treatment. It has become conventional wisdom that the best way to deal with cancer is to catch it early, but it's now apparent that was a simplistic view of things. Says Brawley: "I think people are actually starting to understand that we need to be a little more rigorous in what we accept about screening. I do sense there is some movement there."
ED: On snooping around the internet relative to this article, I ran across websites talking about the "cancer conspiracy", describing how the medical industry is exploiting worries about cancer to peddle expensive treatments while deliberately suppressing revolutionary cancer treatments that actually work. These attacks on mainstream medicine are often linked to promotion of alternative cancer therapies -- why am I not surprised?
* FEEDING THE NINE BILLION (8): As a footnote to this series, as discussed by an article from BUSINESS WEEK ("Can African Farmers Learn To Thrive?" by Alan Bjerga, 1 August 2011), international food aid is important for helping regions afflicted by famine, but there is a tension in the effort. Traditionally, wealthy countries bought surplus crops from their own farmers -- usually from giant traders such as Archer Daniels Midland (ADM) -- and shipped them off to where they were needed. There was no reason to object to such practices on the face of it, since farmers got paid and people got fed. The difficulty with this approach is that such practices undercut farmers in undeveloped nations.
In the 1990s, the United Nations World Food Program (UN WFP), the world's biggest food aid organization, began to buy food from regions closer to famine-struck areas, with the food usually supplied by large African or Asian agribusinesses. In 2007, the WFP came under the direction of Josette Sheeran, previously a US State Department official in the Bush II Administration, who got to wondering how the WFP could help promote farming in undeveloped countries. The result was a program named "Purchase for Progress (P4P)", which has been very successful.
Under the P4P experiment, small farmers get a guaranteed customer and a clearly set price, something US growers have long enjoyed under the crop futures exchange system. P4P staffers provide guidance as the WFP contracts to buy grain and other crops from the small farmers. The contracts allow the farmers to take out loans to buy new seed and fertilizers, the farmers knowing they will be able to pay back the loans. The farmers are also counseled on how to sell their goods to other buyers such as local hospitals and schools. The ultimate goal of P4P is to put itself out of business as farming in undeveloped countries becomes more sophisticated.
The UN says that the five-year pilot program has, at its halfway point, given more than half a million farmers in 20 countries lessons in boosting yields and obtaining credit, and has helped put $30 million USD in the pockets of poor farmers. In Uganda, farmers received one-third more income due to improvements in their corn crop; in South Sudan, food from small farmers helped sustain refugees fleeing the attacks of paramilitary groups. P4P has received US government funding since the program kicked off in 2008, when an abrupt rise in global food prices raised international concern over the threat of widespread hunger; the USA has committed over $35 million USD to the program to help buy food from local farmers. Other donors include the governments of Canada and several European countries, the Bill & Melinda Gates Foundation, and the Howard G. Buffett Foundation.
However, in a time of budget cuts, funds for P4P obtained from the US government are certain to be constrained. Aggravating the pinch is the fact that money being paid out to local farmers means less money for American farmers, as well as traders and shippers. American farm lobbyists say the traditional food aid model does work well, and in fact thanks to forward-based stockpiles and improved distribution mechanisms it's becoming more effective. While Sheeran says that using local farmers will never completely displace direct food aid, even the US Department of Agriculture has estimated a significant shift to local farming could cost American agribusiness a billion USD. As a result, Congressmen from farm states have been digging in their heels over P4P.
Sheeran has her supporters in Congress as well, but it promises to be an uphill fight. Already Congress has authorized a 20% cut in funding. That P4P should struggle in an era of financial hardship is not surprising; that it might wither and die over the long run, however, would be tragic.
* As a lighter comment to tie off this series, THE ECONOMIST online blogs had an essay on the African cashew business. OK, cashews, nutritious though they are, don't sound like much of a solution to world hunger -- but they're a global business worth hundreds of millions of dollars, and a significant component of African agribusiness. Africa grows 40% of the world's cashews; unfortunately, only 10% of them are actually processed in Africa, the raw nuts more commonly being shipped to India and Vietnam -- India being the biggest single market for cashews, and Vietnam the top processor.
A business group named the African Cashew Alliance wants to change the equation by encouraging development of local processing. To this end, an Indian firm named Rajkumar Impex -- the world's biggest cashew processor, handling about 10% of the global crop -- is investing in Africa. Venkatesan Rajkumar, boss of the firm, wants to process about 18% of the global crop by 2014, and to help reach this goal he is now completing a factory at Techiman in Ghana. The factory, which cost $9 million USD to set up, is state of the art, able to process up to 50 tonnes of cashews a day. Rajkumar believes he will get ahead by setting up African factories to eliminate the need to ship cashews to India for processing, instead shipping them directly from Africa to consumer markets. He plans to set up more plants in Africa.
Locals are expected to benefit as well. The factory will employ 1,000 people, most of them women, working for up to $4.00 USD a day, along with food -- good pay by local standards. Rajkumar believes farmers will also get a better deal, since traditionally they sold their product through middlemen, who took their cut. By dealing directly with farmers, Rajkumar believes he will be able to pay less while the farmers get more. Since infrastructure in Ghana is weak, Rajkumar is spending another $9 million USD on a biomass-to-electricity plant, using nut waste and other feedstocks for fuel. The factory will only need a fraction of the electricity, with the rest available to the local community. The government will have to set up the power distribution system, but as African countries go, Ghana is reasonably well governed. [END OF SERIES]
* THE KILLING OF JFK -- THE BALANCE OF EVIDENCE (39): On consideration, the evidence against Oswald becomes overwhelming:
Arguments have been raised against all of these items, but as discussed none of the objections amount to anything more than red herrings at best. The case against Oswald is very strong; the only sensible arguments over it can be over how close to airtight it actually is. If a jury couldn't find Oswald guilty, there's hardly a murder case in history that could yield a conviction.
* SCIENCE NOTES: Modern whales are divided into two distinct groups: the toothed whales -- including dolphins, killer whales, and sperm whales -- which, as the name suggests, have jaws full of teeth; and the "baleen" whales, which have jaws lined with comblike baleen structures that allow the whales to gulp in mouthfuls of water and krill, then strain out the water so the krill can be swallowed en masse. The jaw of baleen whales is also disconnected at the front, allowing it to be opened wide to scoop in the krill.
The fossil record of the toothed whales is relatively detailed, but there's not much known about the early history of the baleen whales. An Australian paleontologist named Erich Fitzgerald has published a study of the fossil remains of a whale named Janjucetus hunderi, dating back about 25 million years, that he characterizes as an "intermediate form" between toothed and baleen whales.
Janjucetus was about the size of a dolphin, but it had a very undolphin-like jaw -- not open at the front, but very wide. Fitzgerald believes that the whale ate by "hoovering", sucking in water and fish, then chomping down on them. It is possible that Janjucetus represents a dead-end branch and that baleen whales actually evolved by some other path, but the example it presents is very suggestive.
* Creationists like to mock the idea that fish could have emerged onto land, but in reality there are a number of species of modern fish that like to spend a good part of their time out of the water -- for example the walking catfish, the tree-climbing perch, and the froglike mudskipper. Another fish that likes the land is the "Pacific leaping blenny (Alticus arnoldorum)" of the islands of Micronesia. It lives in the rocky shore intertidal zones, having no problems surviving in the open air -- it actually avoids going into the water -- though it has to stay wet to keep alive. The blennies are very agile on land, able to move quickly over rocky surfaces with a tail-twisting motion, using enlarged pectoral and tail fins to cling to surfaces. If they're in a hurry, they can also twist their bodies and flick their tails to leap many times their own body length.
At high and low tide, the blennies hide out in rock crevices, only emerging at mid-tide to feed, socialize, and breed. Observations show that males are territorial, using elaborate visual displays to warn off rivals and attract mates. Females aggressively defend feeding territory at the start of their breeding season, while males display a red-colored fin and nod their heads vigorously to attract females to their energetically defended rock holes. Females will inspect a hole before entering it with a chosen mate. Not much is known about their breeding and development of their young, but it appears that females lay their eggs in a chosen rock hole and then play no further role in parenting, leaving the male to guard the eggs.
* As reported by BBC WORLD Online, the search for the cause of chronic fatigue syndrome (CFS) -- a condition in which sufferers feel ailing and run-down on a continuous basis -- has gone on for decades without success. Now the frustration of CFS sufferers seems to be boiling over, with British medical researchers working on CFS finding themselves the targets of public complaints to oversight bodies such as the UK General Medical Council (GMC), of abuse on websites, and worse. Says Professor Simon Wessely of King's College London: "It's direct intimidation in the sense of letters, emails, occasional phone calls and threats. But more often indirect intimidation through my employer or the GMC."
CFS victims are particularly angry at the suggestion that their affliction is psychological, that in effect they're just making it all up. The anger is understandable, but has gone beyond reason. As discussed here previously, there was hope that the cause of CFS had actually been found in the form of the XMRV retrovirus, but further studies have failed to confirm the link. Professor Myra McClure of Imperial College London, who led one of the teams that couldn't validate the connection between CFS and XMRV, has since been rewarded for her work with a flood of denunciations and threats.
McClure finds the hostility baffling. After all, if XMRV is shown to have nothing to do with CFS, then researchers will simply stop wasting time on it and focus on better avenues of investigation that might pay off. Besides, XMRV was only the latest in a list of pathogens that were suspected to cause CFS, but didn't -- and so it's not really a big deal to find out XMRV isn't the cause, either. McClure said: "It really was quite staggeringly shocking, and this was all from patients who seemed to think that I had some vested interest in not finding this virus. I couldn't understand, and still can't to this day, what the logic of that was. Any virologist wants to find a new virus."
McClure says she's moving on to other studies, and she's not alone. The argument over the link between XMRV and CFS has become bitter and wearying, with many of the researchers involved finding it a drain on resources and their careers, even without the added unpleasantry of mindless threats. One said that most researchers in the field would "rather go over Niagara Falls in a barrel" than work on CFS again.
* DUST FACTOR: As the debate over global warming has clearly demonstrated, modeling the Earth's climate is a tricky business, the models being subtly dependent on a wide range of factors. An article from THE ECONOMIST ("A Fistful Of Dust", 8 January 2011), examined one factor that hasn't been given much attention until recently: dust.
On 26 May 2008, winds blowing in from Africa brought clouds of iron-rich red dust to the skies over Germany, leaving a ruddy film on cars and buildings -- and, as determined by researchers at the University of Karlsruhe, dropping the temperature by about a quarter of a degree Celsius until the skies cleared. Germany tends to be more damp than dusty, but in regions such as the Sahara dusty skies are nothing unusual -- with such locales throwing up dust that, as observed by satellites and other systems, can travel thousands of kilometers to influence regional weather, the global climate, and even the growth of forests on the other side of the planet.
The influence of dust is not necessarily malign. African dust is thought, for example, to stimulate plant growth in the Amazon by bringing in phosphorus, which is in short supply there, boosting the rain forest's ability to soak up carbon dioxide. Similarly, parts of the seas are short on iron; dust from the Gobi desert seems to stimulate plankton blooms in the nutrient-poor waters of the North Pacific, though it isn't clear whether this results in a net reduction of atmospheric carbon dioxide, since that would require some of the plankton to sink to the seabed.
Dust in the air, as the Germans found out in 2008, cools the land below. It does this directly, by reflecting sunlight back into space, and indirectly, by helping clouds to form. It is a significant effect, countering about 10% of the warming effect of emissions -- but it's patchy, and in some circumstances it may be unhelpful. Dust that cools a desert can change local airflow patterns and reduce the amount of rain that falls in surrounding areas -- drying them out and making wildfires more likely, increasing the atmospheric CO2 level.
In other words, the impact of dust on global climate, though clearly significant, is poorly understood. Natalie Mahowald of Cornell University and her colleagues have been working towards a better understanding, analyzing sample cores taken from glaciers, lake bottoms, and coral reefs to understand the history of global dust fallout. They then used models of global wind circulation to determine the rise and fall of various dust sources over time. The conclusion of the research is that over the past century, the air has become twice as dusty. Part of the increase is due to human activities, for example from construction or from clearing land for farming. Global warming may also be shifting the boundaries of deserts and increasing dust production in some areas.
Other research suggests that modeling the effects of dust is complicated by the fact that the amount of dust being injected into the atmosphere has been traditionally underestimated. Jasper Kok of the US National Center for Atmospheric Research in Boulder, Colorado, suspects that the amount of coarse dust driven into the atmosphere by wind is at least double and may be eight times as much as previously thought. Kok bases this conclusion on models of how loose soil is blown into the air by the winds. Kok points out that traditionally, atmospheric dust measurements have been oriented towards picking up fine dust -- but in his models, coarse dust is a much more significant component than previously believed.
What to make of dust and its impact on climate? That's hard to say just yet. What is clear is that the issue is yet another example of how fiendishly complicated climate really is.
* SOLAR UV & CLIMATE: Critics of climate-change scenarios have often suggested that climate shifts are actually driven by changes in solar activity, though for the most part the data hasn't shown that to be so. However, a new study hints that solar variability does have some influence on climate.
The Sun's activity rises and falls on an 11-year cycle, and over this cycle the level of ultraviolet (UV) light the Sun emits changes much more than does its total energy output -- which doesn't change significantly; we'd certainly notice it the hard way if it did. The stratosphere, the part of the Earth's atmosphere which does most to absorb UV, might be expected to be particularly sensitive to the cycle. A paper recently published by researchers at the UK's Meteorological Office discussed how the Met's new digital climate model described winters at times of high UV and times of low UV, obtaining UV data from the NASA "Solar Radiation & Climate Experiment (SORCE)" smallsat, launched in 2003.
At low UV levels, according to the paper, the stratosphere in the tropics was cooler, there being less UV energy to absorb, which meant the difference in temperature between the tropical stratosphere and the polar stratosphere shrank. That altered atmospheric circulation, and as those changes spread down into the lower atmosphere, they made it easier for cold surface air from the Arctic to flow south in winter, freezing sections of northern Europe. These conditions looked similar to those seen in the past two cold European winters -- which occurred at a time of low solar activity. The Arctic itself, in models and in the real world, was warmer than usual, as were parts of Canada. In contrast, northern Europe, swathes of Russia and bits of America were colder.
Why hadn't this effect been seen before? To an extent it had, since earlier modeling efforts covering an extended period of low solar activity in the 17th and 18th centuries showed similar effects. Models of contemporary climate hadn't, since they made conservative estimates of solar UV variation -- unrealistically conservative, according to the SORCE data. Some researchers believe the SORCE data is misleading and that the climate patterns have other explanations, but the Met paper is clearly credible, and the critics will have to be equally credible in response.
Climate-change skeptics may be inclined to dismiss the Met paper as a high-tech fairy tale; that would be exactly missing the point. Predicting the future of climate change is tricky, of course, but the real lesson of the Met paper is that the study took high-quality data, ran it through computer models, and then compared the results to the real world -- to find they matched. It then becomes more difficult to argue that climate researchers don't know what they're doing.
* THE NEW MOON RACE: While the US space program seems stalled for the moment, as reported by an article from THE NEW YORK TIMES ("Race To The Moon Heats Up For Private Firms" by Kenneth Chang, 21 July 2011), the nation's commercial spaceflight community appears very active. Encouraged by a $30 million USD prize put up by Google, 29 teams have signed up for a competition to become the first private venture to land a robot probe on the Moon. The last time anyone soft-landed a probe on the Moon was in the 1970s. The perception is that most of the teams won't be able to overcome the financial and technical obstacles to meet the contest deadline of December 2015, but several teams think they have a good shot to win, and get a leg up on future commercial space business.
The Google Lunar X Prize echoes back to the aviation prizes that helped jumpstart airplane technology a century ago. It follows an earlier X Prize competition to build a spacecraft that could carry passengers on suborbital space flights; aircraft designer Burt Rutan, with financial backing from software magnate Paul Allen, won the contest with his "SpaceShipOne", picking up a $10 million USD award from the X Prize Foundation. For the Moon competition, Google put up $30 million USD. Of that, $20 million USD will go to the first team to land a spacecraft on the Moon, explore 500 meters (1,640 feet), and send back high-definition video and photos. The second-place team will win $5 million USD, with the remaining $5 million USD being handed out for bonus prizes like surviving a frigid lunar night or traveling more than 5 kilometers (3.1 miles) on the lunar surface.
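The purse structure described above is simple enough to tally; a quick illustrative check, using only the figures from the article (all USD):

```python
# Google Lunar X Prize purse breakdown, per the article's figures (USD).
grand_prize = 20_000_000   # first landing + 500 m traverse + HD video/photos
second_place = 5_000_000
bonus_pool = 5_000_000     # e.g. surviving a lunar night, roving over 5 km

total_purse = grand_prize + second_place + bonus_pool
print(f"Total purse: ${total_purse:,}")  # -> Total purse: $30,000,000
```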
A Silicon Valley startup named Moon Express is offering to deliver "parcels" from customers to the Moon, the aspiration being something along the lines of a cosmic FedEx. The founder of Moon Express, Naveen Jain -- who made a fortune as an internet entrepreneur, founding the startups Infospace and Intellius -- says his firm will spend up to $100 million USD to get to the Moon. That seems like a bit much for a $30 million prize, but most engaged in the Moon race regard the prize as too little to fund a Moon flight anyway; the prize is effectively just an incentive to ease getting into a commercial spaceflight niche that will hopefully offer profits elsewhere. Jain believes that Moon Express could recoup its investment on the first flight. He envisions selling exclusive broadcast rights for video from the Moon, as well as sponsorships, as with racecars, for companies to put their logos on the lander. There could be a tie-in with reality TV as well.
Another competitor, Astrobotic Technology, intends to sell cargo slots on its Moon lander to space agencies and scientific institutions at a price of $1.8 million USD per kilogram. The company, a spinoff from Carnegie Mellon University and located in Pittsburgh, Pennsylvania, is building a relatively large Moon lander that would be able to carry almost 110 kilograms (240 pounds) of payload, worth $200 million USD to the company. Astrobotic Technology hopes to be ready for launch by the end of 2013. Says company president David Gump: "We will be making substantial profit on the first flight. Basically, we'll break even by selling a third of the payload."
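Gump's break-even claim can be sanity-checked against the article's numbers. The sketch below uses only figures stated above; the implied mission cost is an inference from his quote, not a number from the article:

```python
# Astrobotic payload economics, per the article's figures (USD).
price_per_kg = 1_800_000   # quoted price per kilogram of payload
capacity_kg = 110          # approximate lander capacity

full_manifest = price_per_kg * capacity_kg
print(f"Full manifest revenue: ${full_manifest:,}")  # ~ the article's $200 million

# "We'll break even by selling a third of the payload" implies a
# mission cost of roughly a third of the full-manifest revenue:
implied_cost = full_manifest // 3
print(f"Implied break-even cost: ${implied_cost:,}")
```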
The national space agencies of China, Russia, and India are also working on Moon landers and rovers, but their spacecraft are in a different league, being traditional, complex, and expensive science probes. The US National Aeronautics & Space Administration (NASA) had been planning to send astronauts back to the Moon under the Constellation program, but Constellation was axed in 2010, a victim of budget shortfalls. However, NASA has been very cooperative with the commercial firms working on the Google Lunar X Prize, awarding seed money of a half million dollars each to Moon Express, Astrobotic, and a third competitor, Rocket City Space Pioneers.
The contestants do not appear to face legal obstacles. The Outer Space Treaty of 1967, ratified by 100 nations, prohibits countries from claiming sovereignty over any part of the Moon, but does not prevent private companies from setting up shop. As for mining the Moon, it could fall under similar legal conditions as fishing in international waters. Most of the contenders are enthusiastic about the potential; says Barney Pell, previously a NASA researcher and a co-founder of Moon Express: "It's probably the biggest wealth creation opportunity in modern history. Long term, the market is massive, no doubt. This is not a question of if. It's a question of who and when. We hope it's us and soon."
A skeptic might notice that such enthusiasm for commercial exploitation of the Moon sounds suspiciously like the wild schemes of the dot-com bubble of a decade ago, and some working on the Lunar X-prize are more cautious. Rocket City Space Pioneers, backed by a consortium of aerospace businesses, more practically sees the Moon lander as only a means to an end. The company's goal is to develop and market a payload carrier to handle small secondary payloads launched on a large booster carrying a heavy primary payload. As envisioned, the Rocket City "Rideshare" system will consist of a stack, with:
Given the big interest in small spacecraft missions, discussed here some months back, Rocket City doesn't need to bank on futuristic visions, there being plenty of business opportunities for their technology in the here and now. Tim Pickens of Dynetics, a company in Huntsville, Alabama, that's leading the Rocket City effort, says going for the Moon is "so expensive", and some conservatism is in order: "We don't know what the market is for sure in its entirety."
* FEEDING THE NINE BILLION (7): In earlier installments of this series, the challenge of feeding the world in 2050 was presented in terms of quantity. There is, however, the issue of quality -- not just enough food, but the right sort of food. Most people get enough calories, but nutrient deficiencies are more common than they should be, most notably deficiencies in iron, zinc, iodine, and vitamin C. Iron deficiency makes over 1.5 billion people anemic, including half of all women of child-bearing age in poor countries, while a lack of vitamin A causes up to half a million children to go blind each year, with half dying within a year. Children who survive may be sickly and intellectually impaired as adults. Even the well-fed are not always well-served by their diets, with obesity common in wealthy countries, and now spreading to less developed nations, such as Mexico and Guatemala. The elderly in rich countries also need more calcium and vitamins, and don't always get them.
Solving the nutrient issue is more troublesome than it might seem. While programs to distribute vitamin supplements are common, they're not particularly effective in reaching the rural poor, and have a poor track record. There is also the fact that nutrition isn't necessarily much of a consideration for consumers; compounding the ignorance factor are the energetic efforts of quacks promoting eccentric diets that supposedly provide a remarkable range of benefits, in the most extreme cases even curing AIDS. Manufacturers of dietary supplements may promote them without much concern for their actual usefulness, while mainstream processed-food vendors have a tendency to promote the "healthiness" of their products on the basis of vaporous claims.
Better nutrition, in short, isn't just a matter of handing out diet sheets and expecting everyone to go along. To get things to work requires education; supplements; fortifying processed foods with extra vitamins; and breeding crops to incorporate extra nutrients. Not least, there are also the issues of making things that aren't too expensive and that people actually want to eat. US food manufacturer Kraft made a success of "Biskuat" -- an "energy biscuit" loaded with extra vitamins and minerals -- in Indonesia by selling it at an equivalent of 5 cents a packet. Kraft also did well in Latin America with Tang, a sweet powdered drink with added nutrients, marketing it to children for the taste and to mothers for its nutritional value.
It's also possible to breed plants that contain more nutrients. An organization named HarvestPlus recently introduced in Uganda and Mozambique a new sweet potato variety with more vitamin A. It has proven popular and done much to reduce vitamin deficiencies. HarvestPlus has a pipeline of other "biofortified" crops:
However, it's difficult to come up with plants that can provide the full proper daily dose of nutrients. Workers in the field believe it is important to target the most critical deficiencies and the most vulnerable, particularly infants. None believe the road ahead will be easy, since the problem is not merely one of supply, but also of cultural habits.
* In sum, this survey has suggested that agriculture is confronted with a massive challenge in ensuring that the 9 billion souls on Earth in the year 2050 will have enough to eat. Constraints on land, water, and fertilizer suggest not; modernized farming methods, better infrastructure to reduce waste, and genetic technology say yes. In any case, the struggle to produce more food is almost certain to continue to lead to price surges along with political instabilities; emerging economies like Brazil that are major food exporters are likely to find the shifting order to their advantage. Climate change is similarly likely to strain the system, not only by dislocating traditional agricultural practices but by also altering what farms grow: will crops be planted with an eye towards sequestering carbon, or to produce biofuels?
The immediate future poses both threats to and opportunities for agriculture. As grave as the threats are, however, the promises seem brighter. To be sure, agriculture isn't doing a perfect job now, with about a seventh of the world's population going hungry -- but has it not always been so, that some live in plenty while others starve? Driven by genetic technology and the digital intelligence of 21st-century technology, there is the potential that the global order of 2050 may achieve something unprecedented: a planet where nobody goes hungry. The technical breakthroughs, however, will not be enough if there isn't the political and social will, along with the associated resources, to make it happen. [TO BE CONTINUED]
* THE KILLING OF JFK -- THE BALANCE OF EVIDENCE (38): A final loose thread in the Tippit shooting was the issue of where Oswald was going after he left the boarding house. As noted, nobody knows for sure; and since he had less than $20 USD on him at the time, it's hard to think he could have got very far. The most plausible conjecture is that he was trying to get a local bus to take him to a stop where he could pick up a bus to take him out of town. However, that's pure speculation, and a range of other destinations has been proposed: Jack Ruby's apartment, about another half-hour's walk straight to the south; General Walker's house, about two hours' walk south, the idea being that Oswald was planning to finish off the general for good; or maybe to some "hookup" with the conspiracy.
Among the candidate locations for such a "hookup" was the Redbird Airport, much later renamed the Dallas Executive Airport, a little municipal airport in south Dallas, which Oswald could have reached by picking up a bus. The idea is that Oswald planned to catch an airplane at the Redbird Airport that would take him to Mexico. Conspiracy theorists claim that a plane bound for Mexico did depart the airport that afternoon, and had been sitting there with its engine idling until Oswald's arrest was announced. A variation on this story says that the Dallas police impounded a plane there in the course of their investigation.
This notion got its first play in Richard Popkin's 1966 book THE SECOND OSWALD, with Popkin citing a story told him by conspiracy theorist Jones Harris. Harris said he had spoken with Wayne January, an airport manager, who said that on Wednesday, 20 November 1963, a car with three people in it drove up, with a young man and woman getting out to ask January if they could rent a small plane on Friday, 22 November. January didn't like their looks and refused; he commented that he was sure the third person, who stayed in the car, was Lee Harvey Oswald.
The Dallas police had actually checked the records of Redbird Airport in the wake of the assassination of JFK, looking for suspicious arrivals and departures, and come up with zeroes; they certainly didn't impound an aircraft as part of the investigation. Furthermore, a week after the assassination January told much the same story to the FBI, but placed the time of the inquiry in July, said the two visitors didn't specify when they wanted the plane, and wasn't sure the fellow left in the car was Oswald. January kept retelling the story over the years, though it had an odd tendency to change with each retelling.
* Another figure with a story to tell about Redbird Airport was William Robert "Tosh" Plumlee AKA William H. "Buck" Pearson. In 2004, Plumlee claimed that in 1963, he was a contract pilot working for US government "spook" organizations, and that he had been assigned on 20 November 1963 to fly a special team into Redbird Field on 22 November. It seems that the authorities had got wind of a plot against JFK's life, and the team was to be sent in to protect him.
Plumlee flew the team into Redbird Airport at midmorning on 22 November. The team consisted of mobster Johnny Roselli, several folks Plumlee judged to be New Orleans gangsters, and a number of anti-Castro Cubans -- likely the most eclectic presidential security detail in American history. On an invitation from a Cuban named "Sergio", Plumlee tagged along with the team to Dealey Plaza, where they looked for possible assassins. They failed, with Plumlee hearing "4 or 5 shots" ring out and JFK being killed. The team went back to Redbird Field and Plumlee flew them out that afternoon. There was no record of the flight at Redbird Field, but Plumlee said that was because he didn't file a flight plan. He also didn't bother to go public with his story for over 40 years. [TO BE CONTINUED]
* GIMMICKS & GADGETS: A wide range of feedstocks have been envisioned for biofuels; now researchers at Tulane University in New Orleans have come up with an imaginative feedstock, in the form of old newspapers. They identified a bacterium designated "TU-103", of the Clostridium genus -- which includes "bad actors" such as the bacterium that causes botulism poisoning -- that can break waste cellulose down into butanol. Butanol has higher energy density than ethanol, is easier to handle, and isn't corrosive; while there are other bacteria that can generate butanol, TU-103 is unusual in that it can deal with the presence of oxygen, which kills its rivals. The process is still experimental but seems promising.
* A startup company named Pythagoras Solar is now setting up a pilot installation of their "Building Integrated Photovoltaic (BIPV)" system on the 56th floor of the Willis Tower -- previously the Sears Tower -- in Chicago. The BIPV system looks more or less like a conventional window, but it includes a prismatic system that allows diffuse light to pass through, while directing optical energy to PV cells at the bottom of the window. Pythagoras officials claim that the concentrator scheme is highly efficient, and that the diffuse light provides better interior lighting and climate control than from ordinary window panes. It certainly seems like a very attractive scheme for "glass box" skyscrapers.
I also ran across references to transparent PV cells that could be used to obtain power from window installations; other possible applications include e-readers and other handheld devices with displays. It appears that people have been tinkering with such schemes for some years, but this is the first I've heard about it. Details are obscure, but it seems that a typical scheme is a sandwich of glass with transparent tin oxide electrodes on the inner faces of the glass, and titanium dioxide nanoparticles coated with photoelectric dye in the interior as the "active medium", with an electrolyte used to carry current to the electrodes. Developers claim conversion efficiencies on the order of 10%, though apparently the transparent cells block half the light into the building.
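To get a feel for those claimed figures, here's a back-of-envelope sketch for a hypothetical one-square-meter transparent-cell window; the 1000 W per square meter "peak sun" insolation is a standard assumption, not a number from the reports:

```python
# Rough output of a transparent PV window under peak sun.
insolation_w_per_m2 = 1000.0  # standard "peak sun" assumption (not from the article)
area_m2 = 1.0                 # hypothetical window size
efficiency = 0.10             # developers' claimed conversion efficiency
transmitted = 0.50            # cells reportedly pass about half the light

electrical_w = insolation_w_per_m2 * area_m2 * efficiency
daylight_w = insolation_w_per_m2 * area_m2 * transmitted
print(f"Electrical output: {electrical_w:.0f} W")   # 100 W
print(f"Light transmitted: {daylight_w:.0f} W")     # 500 W
```

A window, in other words, would generate modest power at best under these assumptions, which is why such cells tend to be pitched for large glass-box facades or for trickle-charging handheld displays rather than as primary power sources.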
* It's an annoying fact of life that much of the electronic gear in our homes draws electrical power all the time, whether it's doing anything or not -- the matter having been discussed here in 2006. Part of the problem is that manufacturers are only concerned with the lowest sales price for the gear they sell, and if it ends up being a "vampire" that drives up a buyer's electricity bill, that's not their problem.
As reported by THE NEW YORK TIMES, the set-top boxes that provide access to cable or satellite video entertainment are among the worst offenders -- made even worse when they're coupled to power-hungry digital video recorders (DVRs). A study showed that a set-top box with a DVR gobbled up about 446 kilowatt hours (kWh) a year, about 10% more than an average energy-efficient refrigerator. There are about 160 million such set-top boxes in the USA and the number is growing. The problem is that they are "always on", completely alert 24 hours a day, burning up three times more electricity than their usage actually requires. The sum of this wasted power exceeds the power used by the state of Maryland.
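To get a feel for the scale, here's a back-of-the-envelope calculation using the article's figures. It's only an upper-bound sketch -- it assumes, unrealistically, that all 160 million boxes are DVR units drawing the full 446 kWh a year:

```python
# Back-of-the-envelope: aggregate annual energy drawn by US set-top boxes.
# Assumption (upper bound): every one of the 160 million boxes is a
# DVR-equipped unit drawing 446 kWh/year, per the figures in the article.
KWH_PER_DVR_BOX = 446          # annual draw of one set-top box with DVR
NUM_BOXES = 160_000_000        # installed US set-top boxes

# Convert kWh to terawatt-hours (1 TWh = 1e9 kWh).
total_twh = KWH_PER_DVR_BOX * NUM_BOXES / 1e9
print(f"Upper-bound aggregate draw: {total_twh:.1f} TWh/year")

# Savings if US boxes adopted European-style power management:
# standby cuts the draw in half, "deep sleep" cuts it to 5%.
standby_twh = total_twh * 0.5
deep_sleep_twh = total_twh * 0.05
print(f"With standby: {standby_twh:.1f} TWh/year")
print(f"With deep sleep: {deep_sleep_twh:.2f} TWh/year")
```

Even this crude estimate lands in the tens of terawatt-hours a year, which makes the comparison to a whole state's consumption plausible.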
That needn't be so. Similar devices in some European countries, for example, can automatically go into standby mode when not in use, cutting power draw in half. They can also go into an optional "deep sleep," which can reduce power consumption to 5% of normal operational power draw. US electronics firms and video service providers haven't worried much about the power drain, seeing responsiveness of service as more important. The US Environmental Protection Agency intends to push stricter standards, which the industry has opposed.
* MINING WITH BACTERIA: The notion of extracting valuable minerals using bacterial leaching is not new, but as reported by an article from THE ECONOMIST ("Rocks On The Menu", 12 March 2011), the process is now seeing wider use.
Rock-eating bacteria such as Acidithiobacillus and Leptospirillum are found naturally in nasty, acidic environments. They obtain their energy through chemical reactions with sulfides, and so can boost the breakdown of valuable ores. Base metals such as iron, copper, zinc and cobalt occur widely as sulfides, and more valuable metals such as gold and uranium are also present in the same bodies of ore.
Bioleaching is not very effective for extracting large quantities of metals from rich ores, conventional smelting being more cost-effective in that case. Bioleaching is, however, useful for treating ores with low metal content; it is slow but cheap, and despite the nastiness of leaching pools, it's actually cleaner than smelting, since smelting materials with poisonous components such as arsenic produces nasty air pollutants.
Bioleaching has long been used to recover gold from ores that are hard to break down using "roasting" or heat treatment. The bacteria are introduced into huge stirred tanks or "bioreactors", containing ground-up rocks and dilute sulfuric acid. The bacteria transform one form of iron found within the ore -- ferrous iron -- to another -- ferric iron -- and consume the energy released. In acidic solutions, ferric iron is a powerful oxidizing agent, breaking down sulfide minerals and releasing associated metals. It's been hard to recover metals other than gold profitably in this way in the past, but now high commodity prices mean that bioleaching has become more attractive, and more diverse in application. For example:
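The underlying chemistry can be sketched roughly as follows -- these are textbook reactions for ferric-iron leaching, not equations taken from the article, with chalcopyrite used here as an illustrative sulfide mineral:

```latex
% Bacterial oxidation of ferrous to ferric iron (the microbes' energy source):
4\,\mathrm{Fe^{2+}} + \mathrm{O_2} + 4\,\mathrm{H^+}
    \;\longrightarrow\; 4\,\mathrm{Fe^{3+}} + 2\,\mathrm{H_2O}

% Ferric iron then attacks a sulfide mineral, e.g. chalcopyrite,
% releasing the metal and regenerating ferrous iron for the bacteria:
\mathrm{CuFeS_2} + 4\,\mathrm{Fe^{3+}}
    \;\longrightarrow\; \mathrm{Cu^{2+}} + 5\,\mathrm{Fe^{2+}} + 2\,\mathrm{S^0}
```

The cycle is self-sustaining: the bacteria keep regenerating the ferric oxidant, which keeps chewing through the sulfide ore.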
The Finnish and Chilean ventures both use bioheaps, which are carefully laid-out hill-sized piles of ground-up ore, irrigated from above with dilute sulfuric acid laced with bacterial cultures, and aerated from below. Different recipes of microorganisms are used depending on the composition of the ore.
A Canadian firm named BacTech Mining Corporation, which sells a bioleaching process for gold extraction, has set up a new division to apply the technology to mining waste. At the town of Cobalt in Canada, it plans to remove toxic elements such as arsenic from old silver-mine tailings and extract cobalt, nickel and silver. The goal of the project is primarily environmental remediation, but valuable metals will be extracted as a secondary objective.
Metals can also be extracted from polluted water. Water sources near disused mines are often contaminated due to water draining through old waste dumps. In Germany, a firm called GEOS has set up a pilot plant on a coal-mining site to clean the iron-laced groundwater using bacteria. Organisms with slightly different dietary preferences, known as "sulfate-reducing bacteria", show promise for removing dissolved metals from liquid industrial waste. Paques, a firm based in the Netherlands, has a commercial process for recovering zinc using such bacteria, and British researchers have recently retrieved palladium and platinum in this way, too.
As part of a European project called ProMine, geologists are mapping Europe's mineral resources down to depths of kilometers, in an effort to stimulate the mining industry and reduce dependence on imports. Integral to the project is further development of biological metal-recovery methods.
* In somewhat related news, geothermal plants obtain energy by tapping the Earth's heat, often by drawing up hot water from the depths to drive a steam turbine. That water is often loaded with mineral salts; as reported by THE NEW YORK TIMES, a California startup company named Simbol Materials is now planning to extract valuable metals from the water, including lithium, manganese and zinc. They've all been extracted from brines before, but the idea of hooking up an extraction plant to a geothermal facility seems to be new.
Simbol intends to add their system to an existing geothermal plant in California's Imperial Valley. Simbol officials say they have developed a proprietary filtering process that works fast and meshes neatly with the geothermal facility, simply adding a detour for brine being pumped through the geothermal plant. Costs are low enough to compete in the world metals market, and environmental impact is small since the geothermal plant is already in operation.
* ANOTHER MONTH: For Halloween, I have to give a nod to the cleverest, if one of the geekiest, Halloween costumes for dogs ever made. Katie Mello of Portland, Oregon, rigged up Bones, her pet greyhound, as a STAR WARS "All-Terrain Armored Transport (AT-AT)", one of the elephantine walking war machines from the classic sci-fi movie THE EMPIRE STRIKES BACK. The detail's not too surprising since Mello works at a Portland animation studio named "Laika House" that does some stop-motion model work. WIRED Online warned that Bones better watch out, however: "All it would take is one rogue kitten to tangle a ball of wool around his legs and Bones will be a heap on the floor."
* I've been paying more attention to THE ECONOMIST's online blog section, and one posting -- from "Prospero", the arts and culture commentator -- caught my eye. It began with a description of a food market in Milan, where items offered for sale included calves' feet, pig's tongue, pig's head, and so on; the Italians still have an appetite for parts of animals that end up as dog food in the UK and the USA. Prospero pointed out there was a certain imperative in "total body eating": "After taking an animal's life, the least you can do is use all of it." The idea is catching on a bit in places, but it's hardly a mass movement yet.
That brought back long-buried memories of how my mom used to feed us beef tongue every now and then. It had a funny sort of texture, tending towards the rubbery and chewy, in a good sort of way; the taste was bland and untroubling, and it went well with catsup. Of course, most beef and pork goes well with catsup as far as I'm concerned, an attitude that shocks some people; I don't understand why. I also vaguely recall eating bull's testicles one time, cut into strips and, I think, batter fried as an appetizer.
The most unusual meat I ever ate was kangaroo. At a corporate business event a group of us went to a fancy little restaurant, and one of the women ordered a kangaroo steak. She gave me a bite of it and it was pleasant enough. However, kangaroo is extremely lean meat; about all that can be done to cook it is sear it on both sides. Cook it above "rare" and apparently it's like trying to eat shoe leather.
* It had been sunny here in northeast Colorado through October, but the weather report for the last Wednesday in the month said it would be snowing for the day. I figured it would be just the usual early-season light snowfall; it started snowing as predicted on Tuesday evening, and I went to bed thinking there would be a modest frosting of snow on the ground the next morning.
I got to suspecting I was underestimating the weather when the power went down in the dark hours of the morning; the electromechanical clock on my stove said it happened at 0320. With nothing working in the house, I decided to sleep in until it was light. I never otherwise sleep in, and I must say it was nice for once not to get out of bed feeling grumpy. However, the landscape was a mess, covered with wet snow up to the boot-tops.
Not good -- given the warm fall up to then, only a minority of the trees had lost their leaves. The two young ash trees in my back yard were bowed down and I went out to shake them off. A branch on one had partially cracked, so I took some cord and slung it back upright against the tree. I won't know if I'll have to amputate until spring. Even if I do, I was luckier than others, there being tree branches down all over the neighborhood. I went for a walk and shook off branches from some of the neighbors' trees, getting frosted with snow in the process. I should have worn my rain jacket, but it was still sort of fun.
Lacking anything better to do, I spent the morning reading through my pile of books, lying dressed in sweats under a sleeping bag -- the heat was off. I think I'll see if I can find a cookstove and stockpile a container of dry twigs, so I can at least warm up a can of chili the next time this happens.
I was thinking it would take 8 to 12 hours to get power back on, but it was on after just a bit under 8 hours, which I figured was good time. I've never been through a power failure that went into a second day; I was hoping this wouldn't be the time, I didn't want my freezer to thaw out: "Oh no! My ice cream bars will all melt!" Hmm, maybe I should get a little cheap styrofoam cooler, too, so I could load it up and set it out in the snow. As long as it's freezing out, I might as well take advantage of it. I made a point of emailing a thank-you to the city utility, and got an appreciative response. I had to think the utilities people were in a foul mood that morning.
The next day I went to the gym at the Loveland city center. The park in the middle of the center looked like it had been the scene of a riot, with parts of trees strewn over the grounds. Even by the weekend, there was still wreckage of trees all around, and when I drove past the city recycling center, not far from my house, there was a traffic jam of pickup trucks and the like loaded up with branches, backed up to the arterial -- an unprecedented sight. On Monday I drove past a city park and noticed the entrances to the parking lot were blocked off; on inspection, I found the lot mostly occupied by a long hill of piled-up broken branches. I have rarely seen a more destructive snowstorm.
* Over the summer I somehow managed to pile up articles to write up for the blog -- to the extent that when I looked in my web browser at the list of hyperlinks I'd collected, it ran off the screen. I decided to focus on getting rid of them. I finally did so, though it took me about two months of concentration on the task. I threw out about half, but when I was done I still had over a hundred entries written for future blog postings. That was enough material for 20 weeks -- an absurdity, since articles can go stale if I don't post them in a timely fashion. The pileup in the output queue did give me added incentive to toss things off the list. Hopefully, I can stay ahead of the devil in the future. The odd thing is that when I get down to the bottom of the barrel, I start feeling at loose ends. But that never lasts for long.