Sunday, August 31, 2014

Japanese high school baseball game lasts 50 innings

A baseball game in Hyogo Prefecture, Japan, that started Thursday took 50 innings and four days to determine a winner, which is amazing in its own way, but wait until you see the two starters' pitch counts.

Chukyo finally defeated Sotoku 3-0 in the 50th inning Sunday morning, but look at those pitching lines. Both starting pitchers played the entire game. Taiga Matsui hurled 709 pitches for Chukyo, and Sotoku's Jukiya Ishioka threw 689 pitches. ...

If the game had surpassed 54 innings, the winner would have been decided by a lottery, which would have sucked for the team that played 54 innings of baseball and didn't get picked.
--Samer Kalaf, Deadspin, on the need for a lottery way before 54 innings. HT: Joy of Sox

Can somebody become Chinese?

Try as I might, I just can't become Chinese.

It started as a thought experiment: I wondered what it would take for me, the son of Chinese immigrants, to become a citizen of China. So I called the nearest Chinese consulate and got lost in a voice mail maze with nobody at the end. The consulate's website explained the process for getting visas but not for naturalization.

Then I realized why it was so difficult to get an answer: Beijing doesn't ever expect to hear from foreigners who want to become Chinese citizens.

As it turns out, a naturalization procedure is found under China's Nationality Law. But precious few people pursue it: The 2000 Chinese census counted just 941 naturalized citizens.

But let's say that I decided to become fluent in Mandarin, brush up my knowledge of Chinese history and culture, move to China and live the rest of my life there. Even then, even with thousands of generations of Chinese genes behind me, I would still not be accepted as truly Chinese.

All this crystallized for me why, in this supposed age of a rising China and a declining U.S., we Americans should worry a bit less. No matter how huge China's GDP gets, the U.S. retains a deep, enduring competitive advantage: America makes Chinese Americans. China doesn't make American Chinese.

China also isn't particularly interested in making American Chinese. It isn't in China's operating system to welcome, integrate and empower immigrants to redefine the very meaning of Chinese-ness. That means that China lags behind the U.S. in a crucial 21st-century way: embracing diversity and making something great from many multicultural parts.
--Eric Liu, WSJ, on American exceptionalism

The limits of national assimilation

In the ancient mountains towering above this coastal town in northern Wales, where eight in 10 people speak the native Celtic tongue, and many carry names their fellow Britons would not dare pronounce, Welsh nationalists have their eyes firmly set on independence — Scottish independence.

Less than a month before Scotland holds a referendum on whether to leave Britain, Wales is watching with a mix of envy, excitement and trepidation.

“If Scotland votes yes, the genie is out of the bottle,” said Leanne Wood, leader of Wales’s nationalist party Plaid Cymru. Only one in 10 Welsh voters supports independence, compared with about four in 10 in Scotland, but Ms. Wood thinks that could change. “The tectonic plates of the United Kingdom are shifting,” she said. ...

Unlike Scotland, whose Parliament voted to join England three centuries ago, Wales was conquered in 1282. ...

Caernarfon Castle, up the street from Palas Print, was built by Edward I of England, who killed Llewellyn, the last native prince of Wales, and declared his own firstborn son the Prince of Wales. That tradition still grates with some Welsh people. When Prince Charles was invested in Caernarfon Castle in 1969, militants tried to blow up his train.
--Katrin Bennhold, NYT, on when 732 years isn't enough

Wednesday, August 27, 2014

Winning the alphabet lottery: .tv and the nation of Tuvalu

Today, as video is watched on smartphones and laptops rather than on living room couches, the .tv suffix — owned, improbably, by the tiny South Pacific island nation of Tuvalu — has become for some companies a chance to signal that they are showing video the way people are increasingly used to seeing it. ...

The sudden prominence of .tv is the latest twist in one of the Internet’s more unusual tales. In the 1990s, the suffix .tv was assigned to Tuvalu (Britain received .uk; France, .fr; and so on). At the height of the Internet gold rush, in 1999, a start-up named DotTV paid Tuvalu $50 million over 12 years for the right to sell .tv to other companies. ...

In 2002, Verisign, a large manager of web addresses, acquired the company and still operates the .tv domain today. It agreed in 2011 to manage the .tv address through 2021, and the payments to Tuvalu’s government are said to be a couple million dollars a year.

Those dividend payments are an important revenue source for the country, which has a population of barely 10,000 who live on a tiny cluster of coral atolls and islands about halfway between Australia and Hawaii.

The economic success of Tuvalu and .tv has led other countries to try to leverage their domain names into a consistent revenue source: Montenegro, for example, has the extension .me, which can offer a personal touch to a Web address; and Colombia’s .co has emerged as a logical, less expensive substitute for .com. ...

“I was once shocked when I saw someone using an alternative ending, I thought they were dooming themselves,” said Josh Bourne, a managing partner at FairWinds Partners, a consultant on domain names. “But I’ve changed my opinion,” he said, rattling off prominent examples like Ask.fm (fm for Micronesia) and Bit.ly (ly for Libya).
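The money here is worth a quick back-of-the-envelope check (my arithmetic, reading the article's "couple million" literally; these are not figures from the piece):

```python
# Rough annualization of the two .tv deals described above.
dottv_total_usd = 50_000_000     # DotTV's 1999 deal: $50 million over 12 years
dottv_years = 12
print(f"DotTV era: ${dottv_total_usd / dottv_years:,.0f} per year")   # ~$4.2 million

verisign_annual_usd = 2_000_000  # "a couple million dollars a year" under Verisign
print(f"Verisign era: ~${verisign_annual_usd:,.0f} per year")
```

Read that way, the bubble-era deal implied a higher annual payment to Tuvalu than the current arrangement does.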

Tuesday, August 26, 2014

The calculated plan to export Korean pop culture

The so-called “Korean Wave” of pop culture that seems to have taken the world by storm was set in motion two decades ago. Known as “Hallyu” in Korean, it’s the most well-funded, highly orchestrated national marketing campaign in the history of the world. The goal: to make Korea the world’s top exporter of pop culture. Korean pop culture exports have already gone from nearly zero, in the early 1990s, to $4.6 billion in revenue in 2012 (the most recent official year-end figures available...

The answer lies partly in the Asian Financial Crisis of 1997-1998, which left the country economically crippled, forcing the government to request a $57-billion loan from the IMF. The crisis exposed a huge fault line in the Korean economy: it was too dependent on the nation’s chaebols – megaconglomerates like Hyundai, Samsung, and LG, which hauled the economy up from sub-Saharan African levels of poverty in the ’50s and so became too big to fail.

The government of then-president Kim Dae-jung realised it had to diversify. According to Choi Bokeun, an official at Korea’s Ministry of Culture, Sport, and Tourism, Dae-jung marveled at how much revenue the United States brought in from films, and the UK from stage musicals. And he decided to use those two countries as benchmarks for creating a pop culture industry for Korea.

Was the president out of his mind? Building a pop culture export industry from scratch during a financial crisis seems like bringing a Frisbee instead of food to a desert island. But there was method to the madness. The creation of pop culture, Dae-jung argued, doesn’t require a massive infrastructure; all you really need is time and talent. Of course, that was easier said than done. In order to sell pop culture to the world, you first have to convince the world that your nation is cool.

Korea had no tradition of homegrown pop music bands. If the country wanted to achieve its pop culture export aspirations, it was going to have to use extreme methods. K-pop stars are groomed like Romanian gymnasts or Bolshoi ballerinas – picked out as children and trained for years before they are permitted to perform in public, a process that results in 13-year music contracts. ...

It’s not just the record labels who are putting their noses to the grindstone. The Korean government runs and finances a fund of funds, managed by an entity called the Korean Venture Investment Corporation, with a staggering $1 billion earmarked solely to be invested into Korean pop culture. One government-funded lab is working on virtual reality and hyper-realistic hologram technology – not for the purpose of warfare or espionage but rather, to make a mind-blowing concert experience.
--Euny Hong, Newsweek, on manufacturing cool

European countries are poor by U.S. standards

If Britain were to join the United States, it would be the second-poorest state, behind Alabama and ahead of Mississippi.

The ranking, determined by Fraser Nelson, an editor of The Spectator magazine, was made by dividing the gross domestic product of each state by its population, and it took into account purchasing power parity for cost of living. Several other European countries were also included in the ranking.

--Hunter Schwarz, Washington Post, on putting things in economic perspective. HT: NK
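Nelson's method is simple enough to sketch. A minimal illustration in Python, with placeholder figures roughly in the spirit of 2014 PPP data rather than his actual numbers:

```python
# GDP per capita as Nelson computed it: GDP divided by population,
# with GDP already expressed in PPP-adjusted dollars.
# All figures below are illustrative placeholders, not the article's data.
regions = {
    # name: (GDP in billions of PPP dollars, population in millions)
    "Alabama":     (200.0,  4.8),
    "UK":          (2400.0, 64.1),
    "Mississippi": (105.0,  3.0),
}

per_capita = {name: gdp * 1e9 / (pop * 1e6) for name, (gdp, pop) in regions.items()}
for name, pc in sorted(per_capita.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name:<12} ${pc:,.0f}")
# With these placeholders, the UK lands between Alabama and Mississippi,
# which is the article's claim.
```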

Thursday, August 21, 2014

Buffett on how to invest if you're not Buffett

Vanguard got a huge boost this spring when Warren Buffett gave it a public stamp of approval in March.

The billionaire wrote in his closely watched letter to shareholders of his company, Berkshire Hathaway Inc., that he believed most people would be well-served by following the investing instructions in his will.

Mr. Buffett, 83 years old and with a net worth of $66 billion, wrote that he advised his trustee to "put 10% of the cash in short-term government bonds and 90% in a very low-cost S&P 500 index fund. (I suggest Vanguard's.)."

In the five months that followed, investors poured $5.5 billion into the Vanguard fund, or about three times more than during the same period the previous year.

Vanguard, based in Malvern, Pa., credits Mr. Buffett with the surge of money.
--Kirsten Grind, WSJ, on advice from an active investor

Most New Yorkers can't follow simple spoken directions

“I look at it much less scientifically,” said Susan Kellman, a longtime New York defense lawyer. Her favorite question is a simple listening-skills one: “Tell me one person who’s dead who we all know and respect.”

Most jurors say “my grandmother,” she said, and — unless that grandmother is Golda Meir — that tells her all she needs to know, she added.

“They don’t follow directions,” she said.
--Stephanie Clifford, NYT, on the transmission loss from mouth to brain

Monday, August 18, 2014

What happens when police wear body-mounted cameras

So it is in Rialto, Calif., where an entire police force is wearing so-called body-mounted cameras, no bigger than pagers, that record everything that transpires between officers and citizens. In the first year after the cameras' introduction, the use of force by officers declined 60%, and citizen complaints against police fell 88%. ...

What happens when police wear cameras isn't simply that tamper-proof recording devices provide an objective record of an encounter—though some of the reduction in complaints is apparently because of citizens declining to contest video evidence of their behavior—but that the psychology of everyone involved is modified. ...

In the U.K., where tests with them began in 2005, studies have shown that they aid in the prosecution of crimes, by providing additional, and uniquely compelling, evidence. In the U.S., in some instances they have shortened the amount of time required to investigate a shooting by police from two or three months to two or three days.

And they represent yet one more way we are being recorded, in footage that could eventually be leaked to the public.
--Christopher Mims, WSJ, on the effect of being watched

One black person in the jury pool reduces black defendant convictions by 16%

In a great paper, The Impact of Jury Race in Criminal Trials, Shamena Anwar, Patrick Bayer and Randi Hjalmarsson exploit random variation in the jury pool to estimate the effect of race on criminal trials. The authors have data from nearly 800 trials in two Florida counties. On any given day, a jury pool is randomly drawn from a master list based on driver’s licenses. On some days, the pool of about 30 people contains some black members and on other days, purely for random reasons, it does not. The voir dire process—removals, excuses and challenges—whittles down the jury pool to 6 jury members with typically 1 alternate.

The authors have data on the race, gender, and age of each member of the jury pool as well as each member of the ultimate jury. The authors also know the race and gender of the defendant and the charges. What the authors discover is that all-white juries are 16% more likely to convict black defendants than white defendants, but the presence of just a single black person in the jury pool equalizes conviction rates by race. The effect is large, and remarkably, it occurs even when the black person is not picked for the jury. The latter may not seem possible, but the authors develop an elegant model of voir dire that shows how using up a veto on a black member of the pool shifts the characteristics of the remaining pool members from which the lawyers must pick; that is, a diverse jury pool can make for a more “ideologically” balanced jury even when the jury is not racially balanced.
--Alex Tabarrok, Marginal Revolution, on the elusive goal of color-blindness
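To see why random pool composition identifies the effect, here is a toy Monte Carlo of the design. Every number is invented for illustration; only the structure (randomly drawn pools, conviction probability depending on pool composition) follows the paper.

```python
import random

POOL_SIZE = 30
P_BLACK = 0.04              # chance any given pool member is black (invented)
P_CONVICT_ALL_WHITE = 0.81  # conviction prob. for a black defendant (invented;
P_CONVICT_MIXED = 0.65      #  the 16-point spread loosely mirrors the quoted gap)

def simulate_trial(rng):
    pool_has_black = any(rng.random() < P_BLACK for _ in range(POOL_SIZE))
    p = P_CONVICT_MIXED if pool_has_black else P_CONVICT_ALL_WHITE
    return pool_has_black, rng.random() < p

rng = random.Random(0)
tallies = {True: [0, 0], False: [0, 0]}  # pool_has_black -> [convictions, trials]
for _ in range(100_000):
    has_black, convicted = simulate_trial(rng)
    tallies[has_black][0] += convicted
    tallies[has_black][1] += 1

for has_black, (conv, n) in tallies.items():
    label = "pool with black member(s)" if has_black else "all-white pool"
    print(f"{label}: conviction rate {conv / n:.3f} ({n} trials)")
```

Because the pool draw is random, any gap between the two printed rates is causal by construction, and that is the comparison the authors run on real trials.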

Dollar-cost averaging doesn't work

Bernstein Global Research recently conducted its own study of the subject, and was able to quantify some of the cost of investing gradually. Using the Standard & Poor’s 500-stock index and its predecessors, Bernstein examined the rolling one-year returns of the stock market through 12-month periods from the beginning of 1926 to the end of 2013 — a total of more than 1,000 such periods. It compared lump-sum investments made at the beginning of each period with stock purchases made through “dollar-cost averaging” — regular monthly investments in the S.&P. 500 for 12 months. Money on the sidelines stayed in three-month Treasury bills.

The firm found that the average one-year return was 12.2 percent for immediate investments into the stock index, 8.1 percent for the dollar-cost-averaging portfolios and 3.6 percent for the cash holdings. The penalty for investing gradually, in other words, was 4.1 percentage points. On the other hand, that gradual approach was 4.5 points better than just holding cash. ...

Bad timing happens by accident, though. If you had moved money into the stock market right before a major market peak, you would have been staring at big paper losses immediately.

How bad would those numbers have looked? For a concrete answer, I asked Mr. Masters to sift through his data and find the worst cases among the 12-month periods his firm analyzed.

It turns out that if you had held onto your stocks long enough, you would have come out whole — and much faster than I had expected.

For example, the worst 12-month period since 1926 began on July 1, 1931, during the Depression: The stock index lost 67.6 percent, including dividends, in those 12 months. Yet it would have taken only 39 months — 3.25 years — to erase all your losses, assuming that you had stayed in the market.

In recent decades, Bernstein found, the worst 12-month period began on March 1, 2008, when the market’s return was minus 43.3 percent. Many people bailed out of stocks then and never went back. But if you had stayed fully invested in the market, you would have recovered all of your losses within 22 months — and would be sitting on enormous gains today.
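The mechanics of the Bernstein comparison are easy to reproduce in miniature. A sketch of one 12-month window on made-up monthly returns (the firm ran this over 1,000+ rolling windows of actual S&P 500 and Treasury-bill data):

```python
# Lump sum vs. dollar-cost averaging over a single 12-month window.
# Returns are illustrative, not market data.
stock_monthly = [0.02, -0.01, 0.015, 0.03, -0.02, 0.01,
                 0.005, 0.02, -0.005, 0.01, 0.015, 0.02]
TBILL_MONTHLY = 0.002        # cash "on the sidelines" earns the T-bill rate

def lump_sum(returns, capital=12.0):
    # Everything goes into stocks at the start of the window.
    for r in returns:
        capital *= 1 + r
    return capital

def dollar_cost_average(returns, capital=12.0):
    # One-twelfth of capital moves into stocks each month (at month-end,
    # for simplicity); the rest waits in T-bills.
    invested, cash = 0.0, capital
    for r in returns:
        cash *= 1 + TBILL_MONTHLY
        invested *= 1 + r
        move = min(1.0, cash)
        cash -= move
        invested += move
    return invested + cash

print(f"lump sum: {lump_sum(stock_monthly):.3f}")
print(f"DCA:      {dollar_cost_average(stock_monthly):.3f}")
```

The recovery arithmetic in the Depression example is also worth spelling out: a 67.6 percent loss needs a subsequent gain of 1/(1 - 0.676) - 1, or about 209 percent, just to get back to even, which makes a 39-month recovery (dividends included) faster than intuition suggests.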

Sunday, August 17, 2014

Correction: Debreu was not so dishonorable

Writing too quickly last week about Finding Equilibrium: Arrow, Debreu, McKenzie and the Problem of Scientific Credit, by Till Düppe and E. Roy Weintraub (Princeton University Press, 2014), [Economic Principals] made a serious mistake, combining one misfortune with another and blaming both on Gérard Debreu, the Nobel laureate, who died in 2004.

Robert Anderson, of the University of California at Berkeley, wrote, “I was astonished to read the following statement in the August 11 [EP] article:”
Further details had emerged, including an astonishing fact: the anonymous referee, who bottled up McKenzie’s submission to Econometrica for a critical time, while Arrow and Debreu tidied up their proof, was none other than Debreu himself; and Debreu hadn’t disclosed his conflict of interest to the editor, Robert Solow. Debreu’s conduct was thus revealed as having been dishonorable.
Anderson continued,
The chronology of the book makes it clear that it was [Leonid] Hurwicz and [John] Nash who held up the publication by failing to provide timely referee reports. The chronology states that Solow requested a referee report from Debreu on October 5, 1953; that on December 14, Strotz communicated to McKenzie that the reports were favorable and that Debreu submitted his final referee report recommending publication on December 17, 1953. That is fast refereeing by any standard. 
Furthermore, it indicates that on May 1, 1953, Solow noted the similarity between McKenzie’s paper and that of Arrow and Debreu. So Solow was fully aware of the situation when he asked Debreu to referee the paper. Debreu, however, did fail to notify Solow of the overlap between the Arrow-Debreu and McKenzie papers.
--David Warsh, Economic Principals, on an unfortunate smearing

Saturday, August 16, 2014

Americans don't understand health insurance

The comprehension study shows that people have a limited understanding of traditional health insurance. Only 14% of the sample was able to correctly answer 4 multiple-choice questions about the four basic components of traditional health insurance design: deductibles, copays, coinsurance, and maximum out-of-pocket costs (‘MOOP’). Similarly, many respondents were unable to calculate the cost of basic services covered by the traditional insurance plan. Most strikingly, only 11% were able to correctly answer a fill-in-the-blank question about the cost of a hospitalization. ...

[If] consumers don’t understand their own health insurance policies, it is unlikely that they will respond to the incentives embedded in those policies.
--Loewenstein et al., "Consumers' Misunderstanding of Health Insurance," on an over-engineered product
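For reference, the hospitalization question is mechanical once the four components are defined. A worked sketch with a hypothetical plan (the study's actual plan parameters aren't in this excerpt, and copays are omitted for brevity):

```python
# Patient's cost for a hospital bill under a hypothetical plan.
DEDUCTIBLE = 1_000   # patient pays everything up to this amount
COINSURANCE = 0.20   # patient's share of allowed costs above the deductible
MOOP = 5_000         # maximum out-of-pocket: annual cap on what the patient pays

def patient_cost(bill):
    if bill <= DEDUCTIBLE:
        cost = bill
    else:
        cost = DEDUCTIBLE + COINSURANCE * (bill - DEDUCTIBLE)
    return min(cost, MOOP)

print(patient_cost(30_000))  # 1000 + 0.2 * 29000 = 6800, capped at the 5000 MOOP
print(patient_cost(800))     # below the deductible: the patient pays all 800
```

The quoted finding is that only 11% of respondents could carry out a calculation like this one.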

Tuesday, August 12, 2014

Why don't we require helmets for walking and driving?

Walking and driving are just as dangerous as biking — but they don't require helmets. ...

Back in the early 1990s, Australia collected good data on head injuries for walking, biking, and driving. (This was before the country imposed mandatory helmet laws for bikers.) And what they found was that biking was only slightly more dangerous than walking or driving:

[chart comparing head-injury rates for biking, walking, and driving in Australia]

...

Other data shows that despite increased voluntary helmet use by adults in the US and Great Britain, the overall number of cyclist fatalities hasn't been affected. ...

One can be blamed on drivers: perhaps for subconscious reasons, they seem to be less careful around helmeted cyclists. ...

Many people also suggest that wearing a helmet makes cyclists themselves less cautious in their riding, increasing the chance of an accident.

Why car theft is rare nowadays

Auto theft isn’t much of a problem anymore in New York City. In 1990, the city had 147,000 reported auto thefts, one for every 50 residents; last year, there were just 7,400, or one per 1,100. That’s a 96 percent drop in the rate of car theft.

So, why did this happen? All crime has fallen, nationally and especially in New York. But there has also been a big shift in the economics of auto theft: Stealing cars is harder than it used to be, less lucrative and more likely to land you in jail. As such, people have found other things to do.

The most important factor is a technological advance: engine immobilizer systems, adopted by manufacturers in the late 1990s and early 2000s. These make it essentially impossible to start a car without the ignition key, which contains a microchip uniquely programmed by the dealer to match the car. ...

You can see this in the pattern of thefts of America’s most stolen car, the Honda Accord. About 54,000 Accords were stolen in 2013, 84 percent of them from model years 1997 or earlier, according to data from the National Insurance Crime Bureau, a trade group for auto insurers and lenders. Not coincidentally, Accords started to be sold with immobilizers in the 1998 model year. The Honda Civic, America’s second-most stolen car, shows a similar pattern before and after it got immobilizer technology for model year 2001.

Old cars are easier to steal, and there are plenty of them still on the road. But there’s an obvious problem with stealing them: They’re not worth very much.
--Josh Barro, The Upshot, on criminals responding to incentives
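The per-resident figures imply the arithmetic below (my own check; the populations are backed out of the article's stated ratios):

```python
# Checking the quoted theft rates and the "96 percent" drop.
thefts_1990, thefts_2013 = 147_000, 7_400
pop_1990 = thefts_1990 * 50      # "one for every 50 residents" -> ~7.35M people
pop_2013 = thefts_2013 * 1_100   # "one per 1,100"              -> ~8.14M people

rate_1990 = thefts_1990 / pop_1990   # 0.0200 thefts per resident
rate_2013 = thefts_2013 / pop_2013   # 0.0009 thefts per resident
print(f"drop in rate: {1 - rate_2013 / rate_1990:.1%}")  # ~95.5%, rounding to 96%
```

The implied populations (about 7.35 and 8.14 million) also line up with New York City's actual population in those years, a good sign the quoted ratios are internally consistent.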

Monday, August 11, 2014

Gerard Debreu's dishonorable path to the Nobel Prize

It had been intuitively obvious since Adam Smith that, in an economic system, everything depends on everything else, and thought possible, since Leon Walras, to calculate and measure such a system – in other words, to produce a blueprint to test against the world. But proving that such a coherence is possible in the first place in a world of many competing individuals was the necessary first step to describing it in detail. ...

For thirty years the official story of general equilibrium went like this: Kenneth Arrow and Gerard Debreu, working independently at first, then joining forces, proved that Adam Smith was right, and the rest is history. ...

It was some time in the 1970s that [economist and historian of economic thought Roy] Weintraub first became aware that [Lionel] McKenzie, by then of the University of Rochester, had in the early 1950s proved the same result as had Arrow and Debreu, and slightly earlier at that, but somehow had failed to share in the enormous credit assigned for their famous result. ...

Debreu arrived from Paris in 1950, deeply trained in Bourbakist mathematics and somewhat insulated from the emphasis on planning methods that dominated Cowles at the time. He and Arrow began working on the equilibrium proof separately; when learning of each other’s work, they threw their lots in together and presented their results at the 1952 meetings of the Econometric Society, in Chicago – a day after McKenzie had talked about his work. Their paper, “Existence of an Equilibrium for a Competitive Economy,” appeared in Econometrica eighteen months later, more general than that of McKenzie, but three months after his. ...

Not long after, McKenzie would begin one of the greatest second acts in twentieth-century economics. Unable to get a job at a top-five university (Princeton at least awarded him his Ph.D. on the basis of his journal articles), he signed on at the University of Rochester in 1957 on the strength of a promise that he could build a department. ... And so McKenzie did. ... From modest beginnings, Rochester went on to become one of the most successful training grounds for young economists in the world...

[Weintraub] noticed that the principals had been somewhat reluctant to discuss the details surrounding their respective proofs. He badgered them, gradually learned that Debreu had attended McKenzie’s session and hadn’t told Arrow about it. ...

Further details had emerged, including an astonishing fact: the anonymous referee, who bottled up McKenzie’s submission to Econometrica for a critical time, while Arrow and Debreu tidied up their proof, was none other than Debreu himself; and Debreu hadn’t disclosed his conflict of interest to the editor, Robert Solow. Debreu’s conduct was thus revealed as having been dishonorable.
--David Warsh, Economic Principals, on a reminder to act honorably when refereeing papers "anonymously"


UPDATE: Debreu did not in fact delay his referee report.

Friday, August 8, 2014

Quantifying the advance in PED technology in muscle-building

For many, Schwarzenegger represents the alpha and omega of bodybuilding. He was the sport’s first genuine celebrity, its first crossover star, and still remains the tallest champion (at six-foot-two) in the history of the Olympia. ...

Texas native Ronnie Coleman, an eight-time Mr. Olympia who is arguably the greatest bodybuilder of all time, had a listed height of five-foot-10 but frequently took the competition stage at 295 pounds. Jay Cutler, Coleman’s immediate successor as Mr. Olympia, competed at an equally massive 280 pounds. Even at his peak, Schwarzenegger never exceeded a competition weight of 235 pounds. The physiques of modern bodybuilders were quite literally unattainable during the early days of the sport. ...

From the outset, Mr. Olympia participants benefited from one of the great discoveries of the 1950s: anabolic steroids. ... When 240-pound Lee Haney emerged as an unbeatable competitor in the early 1980s, it appeared that human development could go no further.

With his victory in the 1992 Mr. Olympia, English bodybuilder Dorian Yates changed all of that. Though only five-foot-nine, Yates competed at a lean 270 pounds through the combination of a maniacal training program with precise steroid usage that was stacked with growth hormone. GH proved to be a missing link in the chain that allowed athletes to reach unprecedented lean weights...
--Oliver Lee Bateman, The Atlantic, on Schwarzenegger as a girly-man among today's bodybuilders

Professional adult dodgeball

Dodgeball used to be a children's sport, like hopscotch and tag. But adult dodgeball participation has grown 20% annually over the past three years, according to the Sport and Social Industry Association.

The rules of the traditional game are familiar to anybody who didn't skip third-grade gym class: Hit someone with a ball, and he is out. Catch a ball and the thrower is out. The team whose players are eliminated loses. Simple in concept, it becomes far more difficult when you are playing against highly organized teams with scripted plays and predetermined strategies.

Adult dodgeball is played all sorts of ways. Mr. Marchbanks's team has won tournaments on trampolines, solid ground and mud. The team has played with foam balls and kickballs.

For many, it isn't a casual pastime. Mr. Marchbanks devotes about eight hours a week to practicing or playing the game, and four more hours to studying film of upcoming opponents. ...

Dodgeball, of course, isn't the only childhood pastime appealing to adults these days. The sixth annual World Thumb Wrestling Championship took place this month north of London, and the International Tree Climbing Championship was in Milwaukee. Two tournaments claim to grant the national kickball title. ...

Mr. Marchbanks and his squad—Team Doom—have won about $100,000 in cash prizes over the past two years. If the five members of Team Doom successfully defend their title at the Ultimate Dodgeball Championship starting Friday in Las Vegas, they'll add $20,000 to their winnings.
--Andrew Beaton, WSJ, on never having to grow up

Personal characteristics predict PTSD risk

Veterans who enlisted before graduating from high school were at especially high risk of developing chronic PTSD, as were those involved in intimate, close-combat killing. Previous studies have found the same.

Hispanic veterans were three times as likely as whites to develop the disorder, and blacks twice as likely. Those ethnic differences had turned up in other studies as well, though the gap was mostly explained by differences in education and combat exposure — minority soldiers and Marines generally had less education upon enlistment and saw more combat, compared to whites. The new report found that minority veterans were at high risk of developing chronic PTSD even after correcting for education and combat.
--Benedict Carey, NYT, on surprising correlates of trauma

Wednesday, August 6, 2014

Learning about your peers' savings choices can be demotivating

Peer information interventions involve disseminating information about what a target population’s peers typically do. By sharing this information, it may be possible to teach people that a certain behavior is more common than they had previously believed, motivating those people to engage in the behavior more themselves. This approach has been dubbed “social norms marketing” and is used at approximately half of U.S. colleges in an effort to reduce student alcohol consumption. ...

We conducted our experiment in partnership with a large manufacturing firm and its retirement savings plan administrator. ...

For the [non-saving] recipients, the two peer information mailings stated the fraction of employees in the relevant age bracket who were already enrolled in the savings plan. For the [low-saving] recipients, the two peer information mailings stated the fraction of savings plan participants in the relevant age bracket contributing at least 6% of their pay on a before-tax basis to the plan. ...

We find that among [non-saving] recipients with a 0% contribution rate default—those whom we expected to be most susceptible to our information treatment—receiving peer information significantly reduced the likelihood of subsequently enrolling in the plan from 9.9% to 6.3%, a decrease of approximately one-third. ...

We find that the oppositional reaction among [non-saving] recipients with a 0% default is concentrated among employees with low relative incomes. This result raises the possibility that information about peers’ savings choices discourages low-income employees by making their relative economic status more salient, reducing their motivation to increase their savings rates and generating an oppositional reaction.
--Beshears et al., "The Effect of Providing Peer Information on Retirement Savings Decisions," on when disseminating peer information backfires
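The size of the backfire is easy to verify from the quoted numbers:

```python
# Enrollment among non-savers with a 0% contribution-rate default.
control, treated = 0.099, 0.063   # without vs. with peer information
print(f"absolute change: {treated - control:+.1%}")              # -3.6 points
print(f"relative change: {(treated - control) / control:+.1%}")  # about -36%, i.e. one-third
```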

Spinach is not a good source of iron: The urban legend behind the urban legend

Spinach is not an exceptional nutritional source of iron. The leafy green has iron, yes, but not much more than you’d find in other green vegetables. And the plant contains oxalic acid, which inhibits iron absorption.

Why, then, do so many people believe spinach boasts such high iron levels? Scholars committed to unmasking spinach’s myths have long offered a story of academic sloppiness. German chemists in the 1930s misplaced a decimal point, the story goes. They thus overestimated the plant’s iron content tenfold.

But this story, it turns out, is apocryphal. It’s another myth, perpetuated by academic sloppiness of another kind. The German scientists never existed. Nor did the decimal point error occur. At least, we have no evidence of either. Because, you see, although academics often see themselves as debunkers, in skewering one myth they may fall victim to another.
--Charlie Tyson, Inside Higher Ed, on urban legend upon urban legend. HT: Marginal Revolution

Gluten is probably not causing your stomach upset

In 2011, Peter Gibson, a professor of gastroenterology at Monash University and director of the GI Unit at The Alfred Hospital in Melbourne, Australia, published a study that found gluten, a protein found in grains like wheat, rye, and barley, to cause gastrointestinal distress in patients without celiac disease, an autoimmune disorder unequivocally triggered by gluten. Double-blinded, randomized, and placebo-controlled, the experiment was one of the strongest pieces of evidence to date that non-celiac gluten sensitivity (NCGS), more commonly known as gluten intolerance, is a genuine condition.

But like any meticulous scientist, Gibson wasn't satisfied with his first study. His research turned up no clues to what actually might be causing subjects' adverse reactions to gluten. ... He resolved to repeat the trial with a level of rigor lacking in most nutritional research. Subjects would be provided with every single meal for the duration of the trial. Any and all potential dietary triggers for gastrointestinal symptoms would be removed, including lactose (from milk products), certain preservatives like benzoates, propionate, sulfites, and nitrites, and fermentable, poorly absorbed short-chain carbohydrates, also known as FODMAPs. And last, but not least, nine days' worth of urine and fecal matter would be collected. With this new study, Gibson wasn't messing around.

37 subjects took part, all confirmed not to have celiac disease but whose gastrointestinal symptoms improved on a gluten-free diet, thus fulfilling the diagnostic criteria for non-celiac gluten sensitivity. They were first fed a diet low in FODMAPs for two weeks (baseline), then were given one of three diets for a week with either 16 grams per day of added gluten (high-gluten), 2 grams of gluten and 14 grams of whey protein isolate (low-gluten), or 16 grams of whey protein isolate (placebo). Each subject shuffled through every single diet so that they could serve as their own controls, and none ever knew what specific diet he or she was eating. After the main experiment, a second was conducted to ensure that the whey protein placebo was suitable. In this one, 22 of the original subjects shuffled through three different diets -- 16 grams of added gluten, 16 grams of added whey protein isolate, or the baseline diet -- for three days each.

Analyzing the data, Gibson found that each treatment diet, whether it included gluten or not, prompted subjects to report a worsening of gastrointestinal symptoms to similar degrees. Reported pain, bloating, nausea, and gas all increased over the baseline low-FODMAP diet. Even in the second experiment, when the placebo diet was identical to the baseline diet, subjects reported a worsening of symptoms! The data clearly indicated that a nocebo effect, the same reaction that prompts some people to get sick from wind turbines and wireless internet, was at work here. ... Gluten wasn't the culprit; the cause was likely psychological. Participants expected the diets to make them sick, and so they did. ...

Consider this: no underlying cause for gluten intolerance has yet been discovered.
--Ross Pomeroy, Real Clear Science, on the stomach irritant above the neck. HT: JFB. See also my earlier post on self-induced gluten intolerance

Monday, August 4, 2014

China's strategy towards its ethnic minorities

Those [55 ethnic] minorities number more than 100 million but as a group are all but invisible to the outside world, their situation complicated by the seeming paradox of being citizens of China without being part of the Chinese people. ...

It’s not hard to figure out why China’s ethnic minorities chafe under the domination of a government that combines Han chauvinism with ideological rigidity, and [British journalist] Eimer provides abundant detail. Even as the Han stream into their homelands, minority groups cannot be educated in their native languages or fully practice their religions, are relegated to menial jobs, and are ruthlessly repressed when they complain about such inequalities. ...

“We say China is a country vast in territory, rich in resources and large in population,” Mao Zedong said in a 1956 speech buried deep in the fifth volume of his selected works but cited by Mr. Eimer as a likely explanation for Chinese expansionism. “As a matter of fact, it is the Han nationality whose population is large and the minority nationalities whose territory is vast and whose resources are rich.” ...

But China always plays a long game, so when a Han informant says that “Uighurs are like pandas,” a species both coddled and endangered, that should bring to mind the fate of the Manchus, another minority discussed by Mr. Eimer. The Han way of dealing with the Manchus was “to make them Han” over time, so that their native language “became redundant, while their tribal culture and customs faded away until they were not more than a distant memory.”

Now, he reports, “it is estimated that barely 100 people can speak the language.” The once-mighty Manchus, who gave their name to Manchuria, are thus today “a people in name only who have been absorbed by the Han.” So, Tibetans and Uighurs beware.
--Larry Rohter, NYT, on becoming Han Chinese

Media preferences of today's kids

Some in the media business worried that the troubles at Nickelodeon were a warning sign that today’s digitally wired children would never grow into traditional television watchers. ...

Despite the concerns, children today are watching more television on a traditional television set than they did five years ago. Children ages 2 to 11 now spend an average of 111 hours, 47 minutes a month [about three and a half hours a day!] watching traditional television, according to Nielsen’s Cross-Platform Report for the first quarter of 2014.

That is up from the average of 108 hours, 45 minutes a month children in that age group spent watching traditional television in 2009. ...

After its ratings dropped, Nickelodeon overhauled its operations. The network researched the next generation of children, those born since 2005, who in addition to watching television grew up searching for funny cat videos on YouTube.

These youngsters love their families, want to save the world, are very well behaved, have few close friends but wide social circles and do not like bad language, meanness and bloody violence, the research showed, according to Ms. Zarghami. “Kids are looking for nuggets of funny,” she said.
--Emily Steel, NYT, on the eternal draw of television

Sunday, August 3, 2014

The lives of Times Square cartoon characters

Before stepping into the Times Square hurly-burly of Elmos, Minnie Mouses and Batmen who pose for photographs and then coax customers for tips, Mr. Rodríguez spent a week studying the competition. He analyzed tourist behavior. He calculated potential earnings. And in the absence of anyone masquerading as a certain Nickelodeon star, he spotted an opportunity.

Thus was born SpongeBob SquarePants Rodríguez.

On his first day he made $80 in five hours, a better rate — and more interesting work — than the series of temporary jobs he had held since immigrating to the United States from Ecuador in March. ...

In recent years, these costumed characters have become ubiquitous, replacing the more sordid denizens of decades past. To some critics, they are little more than colorfully attired panhandlers and a chronic nuisance at the Crossroads of the World. ...

Most of the performers are immigrants and many of them are undocumented...

Earnings can vary wildly, from lows of $30 for eight hours of work to highs of over $200. But the rates depend on numerous factors, including the day of the week (weekends generally top weekdays, though they also draw more performers) and the time of the year (as tourism peaks, so does business). ...

Many of the performers live in working-class neighborhoods in New Jersey, a significant cluster of them in the city of Passaic.

“Next door there are five Elmos,” said Miguel Lezama, a 27-year-old Mexican, as he stood in the kitchen of a small apartment in Passaic that he shares with two other immigrants. He pointed in another direction: “On that side, a Cookie Monster and a Minnie. In front, a Winnie-the-Pooh and a Minnie. Up on Main Avenue, there are lots more.” He paused. “I live with a Cookie Monster.”

At certain times of the day, he said, there might be a dozen street performers standing on a corner of Main Avenue with their bulky costume sacks waiting for a bus to take them to Midtown Manhattan. ...

A vaguely defined ecosystem seems to exist within the community, with subspecies divided by costume type: The Disney, Pixar and “Sesame Street” characters gravitate toward one another, and the superheroes hang out with other superheroes. ...

The cartoon characters blame the superheroes for ruining the community’s image. The undocumented immigrants say the American citizens, not worried about deportation, arrogantly flout the law. And the veterans blame the newcomers, calling them money-grubbing arrivistes with no respect for the trade.
--Kirk Semple, NYT, on the humans behind the hustle

Saturday, August 2, 2014

Do men want sex more than women? We didn't think so for most of history

In the 1600s, a man named James Mattock was expelled from the First Church of Boston. His crime? It wasn’t using lewd language or smiling on the sabbath or anything else that we might think the Puritans had disapproved of. Rather, James Mattock had refused to have sex with his wife for two years. Though Mattock’s community clearly saw his self-deprivation as improper, it is quite possible that they had his wife’s suffering in mind when they decided to shun him. The Puritans believed that sexual desire was a normal and natural part of human life for both men and women (as long as it was heterosexual and confined to marriage), but that women wanted and needed sex more than men. A man could choose to give up sex with relatively little trouble, but for a woman to be so deprived would be much more difficult for her.

Yet today, the idea that men are more interested in sex than women is so pervasive that it seems almost unremarkable. Whether it’s because of hormone levels or “human nature,” men just need to have sex, masturbate, and look at porn in a way that simply isn’t necessary for women, according to popular assumptions...

And yet for most of Western history, from ancient Greece to the beginning of the nineteenth century, women were assumed to be the sex-crazed porn fiends of their day. In one ancient Greek myth, Zeus and Hera argue about whether men or women enjoy sex more. They ask the prophet Tiresias, whom Hera had once transformed into a woman, to settle the debate. He answers, “if sexual pleasure were divided into ten parts, only one part would go to the man, and nine parts to the woman.” Later, women were considered to be temptresses who inherited their treachery from Eve. Their sexual passion was seen as a sign of their inferior morality, reason and intellect, and justified tight control by husbands and fathers. Men, who were not so consumed with lust and who had superior abilities of self-control, were the gender more naturally suited to holding positions of power and influence. ...

Early twentieth-century physician and psychologist Havelock Ellis may have been the first to document the ideological change that had recently taken place. In his 1903 work Studies in the Psychology of Sex, he cites a laundry list of ancient and modern historical sources ranging from Europe to Greece, the Middle East to China, all of nearly the same mind about women’s greater sexual desire. ...

The story of how this stereotype became reversed is not a simple one to trace, nor did it happen evenly and all at once. Historian Nancy Cott points to the rise of evangelical Protestantism as the catalyst of this change, at least in New England. Protestant ministers whose congregations were increasingly made up mainly of middle-class white women probably saw the wisdom in portraying their congregants as moral beings who were especially suited to answering the call of religion, rather than as besmirched seductresses whose fate was sealed in Eden. Women both welcomed this portrayal and helped to construct it. It was their avenue to a certain level of equality with men, and even superiority.

Friday, August 1, 2014

Where the "we use 10% of our brain" myth came from

In the early 19th century, a French neurophysiologist named Pierre Flourens conducted a series of innovative experiments. He successively removed larger and larger portions of brain tissue from a range of animals, including pigeons, chickens and frogs, and observed how their behavior was affected.

His findings were clear and reasonably consistent. “One can remove,” he wrote in 1824, “from the front, or the back, or the top or the side, a certain portion of the cerebral lobes, without destroying their function.” For mental faculties to work properly, it seemed, just a “small part of the lobe” sufficed.

Thus the foundation was laid for a popular myth: that we use only a small portion — 10 percent is the figure most often cited — of our brain. ...

But Flourens was wrong, in part because his methods for assessing mental capacity were crude and his animal subjects were poor models for human brain function. Today the neuroscience community uniformly rejects the notion, as it has for decades, that our brain’s potential is largely untapped.
--Gregory Hickok, NYT, on the genesis of the myth