Fran works six days a week in fast food, and yet she’s homeless: ‘It’s economic slavery’

Fran Marion and Bridget Hughes are leading voices in Stand Up Kansas City, part of the Fight for $15 movement that aims to raise the minimum wage across the US

Once a customer has barked their order into the microphone at the Popeyes drive-thru on Prospect Avenue, Kansas City, the clock starts. Staff have a company-mandated 180 seconds to take the order, cook the order, bag the order and deliver it to the drive-thru window.

The restaurant is on short shift at the moment, which means it has about half the usual staff, so Fran Marion often has to do all those jobs herself. On the day we met, she estimates she processed 187 orders – roughly one every two minutes. Those orders grossed about $950 for the company. Marion went home with $76.

Despite working six days a week, Marion, 37, a single mother of two, can't make ends meet on the $9.50 an hour she gets at Popeyes (no apostrophe – founder Al Copeland joked he was too poor to afford one). A fast food worker for 22 years, Marion has almost always had a second job. Until recently, she had been working 9am-4pm at Popeyes, without a break, then crossing town to a janitorial job at Bartle Hall, the convention center, where she would work from 5pm to 1.30am for $11 an hour. She didn't take breaks there either, although they were allowed.

Read more: https://www.theguardian.com/us-news/2017/aug/21/missouri-fast-food-workers-better-pay-popeyes-economics

Billionaire Bloomberg to fund $5m public health projects in 40 cities worldwide

Exclusive: Melbourne, Accra and Ulaanbaatar among cities to benefit from funding pledged by former New York mayor to tackle issues from air pollution to obesity

Michael Bloomberg, the billionaire bête noire of both the sugar industry and the tobacco industry, famously fought for a ban on the sale of large-sized colas and other sweet drinks when he was mayor of New York – and lost. Although that is not how he sees it.

"We actually won that battle," he says. "I have always thought if we had not been stopped by the court, it would have died as an issue. Nobody would have known about it. But the fact that it kept coming back to the newspapers was a gift in disguise because people started to think, 'Holy God, maybe full-sugar drinks are bad for me.'"

"So what happened was consumption of full-sugar drinks around the world has gone down dramatically. If we had won the thing, I think it would have been less."

Bloomberg did plenty more for public health while mayor of New York, including imposing one of the first bans on smoking in bars and restaurants in 2003. Since then he has widened his sphere of influence, funding successful campaigns through his philanthropic foundation for sugar taxes in Mexico and Philadelphia and for curbs on smoking all over the world.

Now, appointed last year as the World Health Organisation's global ambassador for non-communicable diseases – meaning anything that can harm or kill you that is not infectious – the eighth richest person in the world, worth an estimated $47.5bn, is taking his philosophy and his cash to 40 cities around the globe.

His offer, taken up by about 40 cities so far and officially launched on Tuesday, is $5m in assistance from Bloomberg Philanthropies as well as technical support for cities that choose to focus on one of 10 healthy lifestyle issues, including curbing sugary drink consumption, cutting air pollution, promoting exercise and banning smoking. They range from affluent Melbourne in Australia to Cali and Medellín in Colombia, Accra in Ghana, Ulaanbaatar in Mongolia, Kathmandu in Nepal and Kampala in Uganda.

"National and state governments collect taxes, but it is city governments that make things happen. 50% of people currently live in cities and that is projected to rise to 70% in the next decade or so. Cities are where the rubber meets the road," Bloomberg told the Guardian. "The problems are in the cities and the solutions are in the cities."

Bloomberg is upbeat, indomitable and an independent thinker. He made his money in global financial services and has been a Democrat, a Republican and an independent at various times. He says he believes the war on sugar and tobacco, of which his foundation must be seen as the main global financial backer, is being won.

"In parts of the world, clearly yes, and particularly on smoking," he said. "In Europe nobody would have thought people wouldn't insist on smoking in an Irish bar or pub or an Italian restaurant, but the smoking campaign has really worked, reducing consumption in all of western Europe, North and South America and even in China."

"But there are places where poor people live and they are still smoking and really damaging their lungs and they are going to die young. It is up to us to keep the battle going. Sugar is a little bit less developed but still working."

His attention is on non-communicable diseases more broadly now – that includes air pollution and road traffic accidents as well as cigarettes, alcohol and bad food. Cities in poor countries may argue that they have too many other problems to spend time on sugary drinks, but, says Bloomberg, poverty, ill-health and poor education are all interlinked.

"It will be harder to get the public behind you because they less understand the damage being done to their own health. But that's the challenge. The cities where it's easy have probably already addressed the issue," he said.

Michael Bloomberg and WHO director-general Dr Margaret Chan. Photograph: Bloomberg Philanthropies

Bloomberg would not suggest it is easy to make the sort of changes he has pushed for in all these years.

"I don't remember anybody objecting to the smoking ban when we put it in, although a lot of people wanted to take my picture and a lot of people gave me one-finger waves," he said. "If there was an easy solution to a complex problem, we wouldn't have the problem. If you want to make things better, you're going to be doing things that are tough."

The cities that commit to the Partnership for Healthy Cities can choose between curbing sugary drink consumption, passing laws to make public places smoke-free or banning cigarette advertising, cutting salt in food, using cleaner fuels, encouraging cycling and walking, reducing speeding, increasing seatbelt and helmet use, curbing drink driving or carrying out a survey to collect data on the lifestyle risks the city population runs.

Cape Town in South Africa was one of the earliest cities to commit and will focus on reducing the intake of sugary drinks. Its mayor, Patricia de Lille, says they are facing an epidemic of type 2 diabetes, caused by obesity. "Diabetes is a silent killer," she said. "We don't have the luxury to work by trial and error. Unfortunately we have to get it right first time."

London has also said it wants to be involved, although which issue will be the focus has not yet been revealed. It is a city with which Bloomberg says he has a complex relationship – his former wife is British and his daughters hold dual nationality. He has an honorary knighthood from the Queen. He also has an honour from the City of London that he intends one day to cash in.

"I do have the right to drive sheep across London Bridge and before I die, I want to do it one day at rush hour, just to see what happens," he said.

Read more: https://www.theguardian.com/society/2017/may/16/billionaire-bloomberg-to-fund-5m-public-health-projects-in-40-cities-worldwide

Are smartphones really making our children sad?

US psychologist Jean Twenge, who has claimed that social media is having a malign effect on the young, answers critics who accuse her of crying wolf

Last week, the children's commissioner, Anne Longfield, launched a campaign to help parents regulate internet and smartphone use at home. She suggested that the overconsumption of social media was a problem akin to that of junk-food diets. "None of us, as parents, would want our children to eat junk food all the time – double cheeseburger, chips, every day, every meal," she said. "For those same reasons, we shouldn't want our children to do the same with their online time."

A few days later, former GCHQ spy agency chief Robert Hannigan responded to the campaign. "The assumption that time online or in front of a screen is life wasted needs challenging. It is driven by fear," he said. "The best thing we can do is to focus less on the time they spend on screens at home and more on the nature of the activity."

This exchange is just one more example of how children's screentime has become an emotive, contested issue. Last December, more than 40 educationalists, psychologists and scientists signed a letter in the Guardian calling for action on children's screen-based lifestyles. A few days later, another 40-odd academics described the fears as "moral panic" and said that any guidelines needed to build on evidence rather than scaremongering.

Faced with these conflicting expert views, how should concerned parents proceed? Into this maelstrom comes the American psychologist Jean Twenge, who has written a book entitled iGen: Why Today's Super-Connected Kids Are Growing Up Less Rebellious, More Tolerant, Less Happy and Completely Unprepared for Adulthood and What That Means for the Rest of Us.

If the book's title didn't make her view clear enough, last weekend an excerpt was published in the American magazine the Atlantic with the emotive headline "Have smartphones destroyed a generation?" It quickly generated differing reactions that were played out on social media – these could be broadly characterised as praise from parents and criticism from scientists. In a phone interview and follow-up emails, Twenge explained her conclusions about the downsides of the connected world for teens, and answered some of her critics.

The Atlantic excerpt from your book was headlined "Have smartphones destroyed a generation?" Is that an accurate reflection of what you think?
Well, keep in mind that I didn't write the headline. It's obviously much more nuanced than that.

So why did you write this book?
I've been researching generations for a long time now, since I was an undergraduate, almost 25 years. The databases I draw from are large national surveys of high school and college students, and one of adults. In 2013-14 I started to see some really sudden changes and at first I thought maybe these were just blips, but the trends kept going.

I'd never seen anything like it in all my years of looking at differences among generations. So I wondered what was going on.

What were these sudden changes for teens?
Loneliness and depressive symptoms started to go up, while happiness and life satisfaction started to go down. The other thing that I really noticed was the accelerated decline in seeing friends in person – it falls off a cliff. It's an absolutely stunning pattern – I'd never seen anything like that. I really started to wonder, what is going on here? What happened around 2011-2012 [the survey data is a year or two behind] that would cause such sudden changes?

And you concluded these changes were being brought about by increased time spent online?
The high-school data detailed how much time teens spend online on social media and games and I noticed how that correlated with some of these indicators in terms of happiness, depression and so on.

I was curious not just what the correlations were between these screen activities, mental health and wellbeing, but what were the links with non-screen activities, like spending time with friends in person, playing sports, going to religious services, doing homework, all these other things that teens do?

And for happiness in particular, the pattern was so stark. All of the non-screen activities that were measured correlated with greater happiness. All the screen activities correlated with lower happiness.

You've called these post-millennials the "iGeneration". What are their characteristics?
I'm defining iGen as those born between 1995 and 2012 – that latter date could change based on future data. I'm reasonably certain about 1995, given the sudden changes in the trends. It also happens that 1995 was the year the internet was commercialised [Amazon launched that year, Yahoo in 1994 and Google in 1996], so if you were born in that year you have not known a time without the internet.

But the introduction of the smartphone, exemplified by the iPhone, which was launched in 2007, is key?
There are a lot of differences – some are large, some are subtle, some are sudden and some had been building for a while – but if I had to identify what really characterises them, the first influence is the smartphone.

iGen is the first generation to spend their entire adolescence with the smartphone. This has led to many ripple effects for their wellbeing, their social interactions and the way they think about the world.

Psychology professor Jean Twenge. Photograph: Gregory Bull/AP

Why are you convinced they are unhappy because of social media, rather than it being a case of the unhappy kids being heavier users of social media?
That is very unlikely to be true because of very good research on that very question. There is one experiment and two longitudinal studies that show the arrow goes from social media to lower wellbeing and not the other way around. For example, in one experiment people who gave up Facebook for a week had better wellbeing than those who had not.

The other thing to keep in mind is that if you are spending eight hours a day with a screen you have less time to spend interacting with friends and family in person and we know definitively from decades of research that spending time with other people is one of the keys to emotional wellbeing; if you're doing that less, that's a very bad sign.

A professor at Oxford University tweeted that your work is "a non-systematic review of sloppy social science as a tool for lazy intergenerational shaming" – how do you respond?
It is odd to equate documenting teens' mental health issues with intergenerational shaming. I'm not shaming anyone and the data I analyse is from teens, not older people criticising them.

This comment is especially strange because this researcher's best-known paper, about what he calls the Goldilocks theory, shows the same thing I find – lower wellbeing after more hours of screen time. We're basically replicating each other's research across two different countries, which is usually considered a good thing. So I am confused.

Your arguments also seem to have been drawn on by the conservative right as ammunition for claims that technology is leading to the moral degradation of the young. Are you comfortable about that?
My analyses look at what young people are saying about themselves and how they are feeling, so I don't think this idea of "older people love to whine about the young" is relevant. I didn't look at what older people have to say about young people. I looked at what young people are saying about their own experiences and their own lives, compared to young people 10, 20, or 30 years ago.

Nor is it fair or accurate to characterise this as youth-bashing. Teens are saying they are suffering and documenting that should help them, not hurt them. I wrote the book because I wanted to give a voice to iGen and their experiences, through the 11 million who filled out national surveys, to the 200-plus who answered open-ended questions for me, to the 23 I talked to for up to two hours. It had absolutely nothing to do with older people and their complaints about youth.

Many of us have a nagging feeling that social media is bad for our wellbeing, but we all suffer from a fear of missing out.
Teens feel that very intensely, which is one reason why they are so addicted to their phones. Yet, ironically, the teens who spend more time on social media are actually more likely to report feeling left out.

But is this confined to iGeners? One could go to a child's birthday party where the parents are glued to their smartphones and not talking to each other too.
It is important to consider that while this trend also affects adults, it is particularly worrisome for teens because their brain development is ongoing and adolescence is a crucial time for developing social skills.

You say teens might know the right emoji but in real life might not know the right facial expression.
There is very little research on that question. There is one study that looked at the effects of screens on social skills among 11- to 12-year-olds, half of whom used screens at their normal level and half went to a five-day screen-free camp.

Those who attended the camp improved their social skills – reading emotions on faces was what they measured. That makes sense – that's the social skill you would expect to suffer if you weren't getting much in-person social interaction.

So is it up to regulators or parents to improve the situation? Leaving this problem for parents to fix is a big challenge.
Yes it is. I have three kids and my oldest is 10, but in her class about half have a phone, so many of them are on social media already. Parents have a tough job, because there are temptations on the screen constantly.

What advice would you give parents?
Put off getting your child a phone for as long as possible and, when you do, start with one that doesn't have internet access so they don't have the internet in their pocket all the time.

But when your child says, "but all my friends have got one", how do you reply?
Maybe with my parents' line: "If your friends all jumped in the lake, would you do it too?" Although at that age the answer is usually yes, which I understand. But you can do social media on a desktop computer for a limited time each day. When we looked at the data, we found that an hour a day of electronic device use doesn't have any negative effects on mental health – two hours a day or more is when you get the problems.

The majority of teens are on screens a lot more than that. So if they want to use Instagram, Snapchat or Facebook to keep up with their friends' activities, they can do that from a desktop computer.

That sounds hard to enforce.
We need to be more understanding of the effects of smartphones. In many ways, parents are worried about the wrong things – they're worried about their kids driving and going out. They don't worry about their kids sitting by themselves in a room with their phone – and they should.

Lots of social media features, such as notifications or Snapchat's Snapstreak, are engineered to keep us glued to our phones. Should these types of features be outlawed?
Oh man. Parents can put an app [such as Kidslox or Screentime] on their kid's phone to limit the amount of time they spend on it. Do that right away. In terms of the bigger solutions, I think that's above my pay grade to figure out.

You've been accused by another psychologist of cherry-picking your data. Of ignoring, say, studies that suggest active social media use is associated with positive outcomes such as resilience. Did you collect data to fit a theory?
It's impossible to judge that claim – she does not provide citations to these studies. I found a few studies finding no effects or positive effects, but they were all older, before smartphones were on the scene. She says in order to prove smartphones are responsible for these trends we need a large study randomly assigning teens to not use smartphones or use them. If we wait for this kind of study, we will wait for ever – that type of study is just about impossible to conduct.

She concludes by saying: "My suspicion is that the kids are gonna be OK." However, it is not OK that 50% more teens suffer from major depression now versus just six years ago and three times as many girls aged 12 to 14 take their own lives. It is not OK that more teens say that they are lonely and feel hopeless. It is not OK that teens aren't seeing their friends in person as much. If we twiddle our thumbs waiting for the perfect experiment, we are taking a big risk and I for one am not willing to do that.

Are you expecting anyone from Silicon Valley to say: "How can we help?"
No, but what I think is interesting is many tech-connected people in Silicon Valley restrict their own children's screen use, so they know. They're living off of it but they know its effects. It indicates that pointing out the effects of smartphones doesn't make you a luddite.

iGen: Why Today's Super-Connected Kids Are Growing Up Less Rebellious, More Tolerant, Less Happy and Completely Unprepared for Adulthood and What That Means for the Rest of Us by Jean Twenge is published by Simon & Schuster US ($27) on 22 August

Read more: https://www.theguardian.com/technology/2017/aug/13/are-smartphones-really-making-our-children-sad

Rule that patients must finish antibiotics course is wrong, study says

Experts suggest patients should stop taking the drugs when they feel better rather than completing their prescription

Telling patients to stop taking antibiotics when they feel better may be preferable to instructing them to finish the course, according to a group of experts who argue that the rule long embedded in the minds of doctors and the public is wrong and should be overturned.

Patients have traditionally been told that they must complete courses of antibiotics, the theory being that taking too few tablets will allow the bacteria causing their disease to mutate and become resistant to the drug.

But Martin Llewelyn, a professor in infectious diseases at Brighton and Sussex medical school, and colleagues claim that this is not the case. In an analysis in the British Medical Journal, the experts say the idea that stopping antibiotic treatment early encourages antibiotic resistance is not supported by evidence, while taking antibiotics for longer than necessary increases the risk of resistance.

"There are some diseases where the bug can become resistant if the drugs are not taken for long enough. The most obvious example is tuberculosis," they say. But most of the bacteria that cause people to become ill are found on everybody's hands in the community, causing no harm, such as E coli and Staphylococcus aureus. People fall ill only when the bug gets into the bloodstream or the gut. The longer such bacteria are exposed to antibiotics, the more likely it is that resistance will develop.

The experts say there has been too little research into the ideal length of a course of antibiotics, which also varies from one individual to the next, depending in part on what antibiotics they have taken in the past.

In hospital, patients can be tested to work out when to stop the drugs. Outside hospital, where repeated testing may not be feasible, patients might be best advised to stop treatment when they feel better, they say. That, they add, is in direct contravention of World Health Organisation advice.

Other experts in infectious diseases backed the group. "I have always thought it to be illogical to say that stopping antibiotic treatment early promotes the emergence of drug-resistant organisms," said Peter Openshaw, president of the British Society for Immunology.

"This brief but authoritative review supports the idea that antibiotics may be used more sparingly, pointing out that the evidence for a long duration of therapy is, at best, tenuous. Far from being irresponsible, shortening the duration of a course of antibiotics might make antibiotic resistance less likely."

Alison Holmes, a professor of infectious diseases at Imperial College London, said a great British authority, Prof Harold Lambert, had made the same point in a Lancet article entitled "Don't keep taking the tablets" as early as 1999. "It remains astonishing that, apart from some specific infections and conditions, we still do not know more about the optimum duration of courses or indeed doses in many conditions, yet this dogma has been pervasive and persistent."

Jodi Lindsay, a professor of microbial pathogenesis at St George's, University of London, said it was sensible advice. "The evidence for completing the course is poor, and the length of the course of antibiotics has been estimated based on a fear of under-treating rather than any studies," she said. "The evidence for shorter courses of antibiotics being equal to longer courses, in terms of cure or outcome, is generally good, although more studies would help and there are a few exceptions when longer courses are better – for example, TB."

But the Royal College of GPs expressed concerns. "Recommended courses of antibiotics are not random," said its chair, Prof Helen Stokes-Lampard. "They are tailored to individual conditions and in many cases, courses are quite short – for urinary tract infections, for example, three days is often enough to cure the infection."

"We are concerned about the concept of patients stopping taking their medication midway through a course once they feel better, because improvement in symptoms does not necessarily mean the infection has been completely eradicated. It's important that patients have clear messages and the mantra to always take the full course of antibiotics is well known. Changing this will simply confuse people."

The UK's chief medical officer, Prof Dame Sally Davies, said: "The message to the public remains the same: people should always follow the advice of healthcare professionals. To update policies, we need further research to inform them.

"[The National Institute for Health and Care Excellence] is currently developing guidance for managing common infections, which will look at all available evidence on appropriate prescribing of antibiotics.

"The Department of Health will continue to review the evidence on prescribing and drug-resistant infections, as we aim to continue the great progress we have made at home and abroad on this issue."

Read more: https://www.theguardian.com/society/2017/jul/26/rule-patients-must-finish-antibiotics-course-wrong-study-says

Journalist under fire for calling it ‘crazy’ not to be disgusted by homeless people

Prominent Mother Jones writer Kevin Drum says critics are deliberately misreading his response to a study on people's reaction to seeing homelessness

A high-profile Mother Jones writer has suggested that it would be "crazy" not to have a reflexive disgust of homeless people, stirring the anger of those who say he is perpetuating the worst kinds of stereotypes.

Writing on Friday, Kevin Drum was responding to a study which found that some people with a propensity for feeling disgust might experience it when faced with someone living on the street.

Glenn Greenwald reacted by posting photographs of homeless people who have performed altruistic acts alongside a screenshot from Drum's story. The two authors of the study, meanwhile, say Drum glossed over subtleties in their work.


"He seemed to just be endorsing the worst stereotypes without any nuance or without any humanization of these people," said Scott Clifford, one of the authors and an assistant professor of political science at the University of Houston.

Drum said his critics were guilty of "deliberately misreading what I wrote".

The authors of the study – which is admittedly eyebrow-raising owing to its lexicon – set out to untangle a contradiction. Across the country, cities seek to aid homeless people by providing shelters and millions of dollars in funding, while also passing laws against sitting or lying on sidewalks, or restricting where RVs can park, which serve to exclude them.

They examined survey data and focused on a particular feeling that seemed to play a role in perpetuating this paradox: "While most of the public wants to help homeless people," they write, "sensitivity to disgust drives many of these same people to support policies that facilitate physical distance from homeless people."

Disgust, they propose, might help explain nimbyism in this case – a desire among housed people to prevent camps or housing being built in the vicinity of their own homes. And they argue that the media exacerbates disgust with stories that mention disease and unsanitary conditions.

But they do not say that this kind of reaction is universal: while some people are prone to feeling disgust in the presence of homelessness, others are less likely to.

In his brief response to a summary that the authors published in the Washington Post, Drum said he found their results unsurprising. "About half the homeless suffer from a mental illness and a third abuse either alcohol or drugs," he wrote, before commenting how crazy it would be not to be disgusted by a population like that.

He finished by suggesting that it was the work of a decent human being to overcome these reflexive feelings and find empathy.

"It certainly is the work of a good human being not to act fully based on immediate reactions," said Maria Foscarinis, executive director of the National Law Center on Homelessness & Poverty. She said the study seems to make sense, though she had some reservations. But she did not agree with Drum, calling the post "really over the top" and not true to what the paper is saying.

"It's just a manifestation of the worst kinds of stereotypes. As a subscriber to this publication, I'm really disappointed."

Pete White, head of the Los Angeles Community Action Network, said he thought Drum's conclusions risked tarring an entire group of people, as if every houseless person were addicted to drugs and had a mental illness.

Both of the study's authors expressed displeasure. "He appears to believe that everyone will in all circumstances feel disgust towards homeless people," said Spencer Piston, the other author and an assistant professor of political science at Boston University. "There's a clear irony here, which is that we argue that the connection between disgust and attitudes about the homeless depends in part on media coverage and the extent to which homeless people are portrayed as disgusting."

In an email, Drum said that he did not think his blogpost was unfaithful to the study. He also pushed back at those condemning him. "Please note that I didn't say I was disgusted by the homeless, nor that they are inherently disgusting," he said. "Only that, given the nature of the demographic, it's not surprising that most people find them disgusting."

Clara Jeffery, the editor-in-chief of Mother Jones, said that the anger was fueled by the terms used in the study and not Drum's writing itself. "But it is one brief post about a study," she added in her email. "Mother Jones has an extensive body of work on the homeless, the housing and mental health and opioid crisis fueling it."


Read more: https://www.theguardian.com/us-news/2017/jul/17/homelessness-kevin-drum-mother-jones-disgust

How the middle class hoards wealth and opportunity for itself

American society is dominated by an elite 20% that ruthlessly protects its own interests

When I was growing up, my mother would sometimes threaten my brother and me with electrocution. Well, that's not quite right. In fact, the threat was of lessons in elocution, but we – wittily, we thought – renamed them.

Growing up in a very ordinary town just north of London and attending a very ordinary high school, we committed several linguistic atrocities, one of which was failing to pronounce the t in certain words. My mother, who was raised in rural north Wales and left school at 16, did not want us to find doors closed in a class-sensitive society simply because we didn't speak what is still called the Queen's English. I will never forget the look on her face when I managed to say the word computer with neither a p nor a t.

Still, the lessons never materialised. Any lingering working-class traces in my own accent were wiped away by three disinfectant years at Oxford University. (My wife claims the adolescent accent resurfaces when I drink, but she doesn't know what she's talking about – she's American.) We also had to learn how to waltz. My mother didn't want us to put a foot wrong there either.

In fact, we did just fine, in no small part because of the stable, loving home in which we were raised. But I have always been acutely sensitive to class distinctions and their role in perpetuating inequality. In fact, one of the reasons I came to the United States was to escape the cramped feeling of living in a nation still so dominated by class. I knew enough not to think I was moving to a socially mobile utopia: I'd read some of the research. It has nonetheless come as something of a shock to discover that, in some important respects, the American class system is functioning more ruthlessly than the British one I escaped.

In the upper-middle-class America I now inhabit, I witness extraordinary efforts by parents to secure an elite future status for their children: tutors, coaches and weekend lessons in everything from French to fencing. But I have never heard any of my peers try to change the way their children speak. Perhaps this is simply because they know they are surrounded by other upper-middle-class kids, so there is nothing to worry about. Perhaps it is a regional thing.

But I think there is a better explanation. Americans tend to think their children will be judged by their accomplishments rather than their accents. Class position is earned, rather than simply expressed. The way to secure a higher status in a market meritocracy is by acquiring lots of merit and ensuring that our kids do, too. "What one's parents are like is entirely a matter of luck," points out the philosopher Adam Swift. But he adds: "What one's children are like is not." Children raised in upper-middle-class families do well in life. As a result, there is a lot of intergenerational stickiness at the top of the American income distribution – more, in fact, than at the bottom – with upper-middle-class status passed from one generation to the next.

Drawing class distinctions feels almost un-American. The nation's self-image is of a classless society, one in which every individual is of equal moral worth, regardless of his or her economic status. This has been how the world sees the United States, too. Historian Alexis de Tocqueville observed in the early 19th century that Americans were seen to be more equal in fortune and intelligence – more equally strong, in other words – than they were in any other country, or were at any other time in recorded history. So different to the countries of old Europe, still weighed down by the legacies of feudalism.

British politicians have often felt the need to urge the creation of a classless society, looking to America for inspiration as what historian David Cannadine once called "the pioneering and prototypical classless society". European progressives have long looked enviously at social relations in the New World. George Orwell noted the lack of servile tradition in America; the German socialist Werner Sombart noticed that "the bowing and scraping before the upper classes, which produces such an unpleasant impression in Europe, is completely unknown".

This is one of many reasons socialist politics struggled to take root in the United States. A key attraction of socialist systems – the main one, according to Orwell – is the eradication of class distinctions. There were few to eradicate in America. I am sure that one reason Downton Abbey and The Crown so delight American audiences is their depictions of an alien world of class-based status. One reason class distinctions are less obvious in America is that pretty much everyone defines themselves as a member of the same class: the one in the middle. Nine in ten adults select the label "middle class", exactly the same proportion as in 1939, according to the pollsters Gallup. No wonder that politicians have always fallen over each other to be on their side.

But in recent decades Americans at the top of the ladder have been entrenching their class position. The convenient fiction that the middle class can stretch up that far has become a difficult one to sustain. As a result, the modifications "upper" or "lower" to the general middle-class category have become more important.

Class is not just about money, though it is about that. The class gap can be seen from every angle: education, security, family, health, you name it. There will also be inequalities on each of these dimensions, of course. But inequality becomes class division when all these varied elements – money, education, wealth, occupation – cluster together so tightly that, in practice, almost any one of them will suffice for the purposes of class definition. Class division becomes class stratification when these advantages – and thus status – endure across generations. In fact, upper-middle-class status is passed down to the next generation more effectively than in the past, and in the United States more than in other countries.

One benefit of the multidimensional nature of this separation is that it has reduced interdisciplinary bickering over how to define class. While economists typically focus on categorisation by income and wealth, sociologists tend more towards occupational status and education, and anthropologists are typically more interested in culture and norms, right now it doesn't really matter, because all the trends are going the same way.

It is not just the top 1% pulling away, but the top 20%. In fact, only a very small proportion of US adults – 1% to 2% – define themselves as upper class. A significant minority – about one in seven – adopts the "upper middle class" description. This is quite similar to the estimates of class size generated by most sociologists, who tend to define the upper middle class as one composed of professionals and managers, or around 15% to 20% of the working-age population.

As David Azerrad of the Heritage Foundation writes: "There is little appetite in America for policies that significantly restrict the ability of parents to do all they can, within the bounds of the law, to give their children every advantage in life." That is certainly true. But then Azerrad has also mis-stated the problem. No one sensible is in favour of new policies that block parents from doing the best they can for their children. Even in France the suggestion floated by the former president, François Hollande, to restore equality by banning homework, on the grounds that parents differ in their ability and willingness to help out, was laughed out of court. But we should want to get rid of policies that allow parents to give their children an unfair advantage and in the process restrict the opportunities of others.

Most of us want to do our best for our children. "Wanting one's children's lives to go well is part of what it means to love them," write philosophers Harry Brighouse and Adam Swift in their 2014 book Family Values: The Ethics of Parent-Child Relationships. But our natural preference for the welfare and prospects of our own children does not automatically eclipse other moral claims. We would look kindly on a father who helps his son get picked as starting pitcher for his school baseball team by practising with him every day after work. But we would probably feel differently about a father who secures the slot for his son by bribing the coach. Why? After all, each father has sacrificed something, time in one case, money in the other, to advance his child. The difference is that team selection should be based on merit, not money. A principle of fairness is at stake.

So, where is the line drawn? The best philosophical treatment of this question I have found is the one by Swift and Brighouse. Their suggestion is that, while parents have every right to act in ways that will help their children's lives go well, they do not have the right to confer on them a competitive advantage – in other words, to ensure not just that they do well but that they do better than others. This is because, in a society with finite rewards, improving the situation of one child necessarily worsens that of another, at least in relative terms: "Whatever parents do to confer competitive advantage is not neutral in its effects on other children – it does not leave untouched, but rather is detrimental to, those other children's prospects in the competition for jobs and associated rewards."

The trouble is that in the real world this seems like a distinction without a difference. What they call "competitive advantage-conferring parental activities" will almost always also be helping-your-kid-flourish parental activities. If I read bedtime stories to my son, he will develop a richer vocabulary and may learn to love reading and have a more interesting and fulfilling life. But it could also help him get better grades than his classmates, giving him a competitive advantage in college admissions. Swift and Brighouse suggest a parent should not even aim to give their child a competitive advantage: "It would be a little odd, perhaps even a little creepy, if the ultimate aim of her endeavours were that her child is better off than others."

I think this is too harsh. In a society with a largely open, competitive labour market, it is not creepy to want your children to end up higher on the earnings ladder than others. Not only will this bring them a higher income, and all the accompanying choices and security, it is also likely to bring them safer and more interesting work. Relative position matters – it is one reason, after all, that relative mobility is of such concern to policymakers. Although I think Brighouse and Swift go too far, they are on to something important with their distinction between the kind of parental behaviour that merely helps your own children and the kind that is detrimental to others. That's what I call opportunity hoarding.

Opportunity hoarding does not result from the workings of a large machine but from the cumulative effect of individual choices and preferences. Taken in isolation, they may feel trivial: nudging your daughter into a better college with a legacy preference [giving applicants places on the basis of being related to alumni of the college]; helping the son of a professional contact to an internship; a single vote on a municipal council to retain low-density zoning restrictions. But, like many micro-preferences, to borrow a term from economist Thomas Schelling, they can have strong effects on overall culture and collective outcomes.

Over recent decades, institutions that once primarily served racist goals – legacy admissions to keep out Jewish students, zoning laws to keep out black families – have not been abandoned but have been softened, normalised and subtly re-purposed to help sustain upper-middle-class status. They remain, then, barriers to a more open, more genuinely competitive and fairer society. I won't insult your intelligence by pretending there are no costs here. By definition, reducing opportunity hoarding will mean some losses for the upper middle class.

But they will be small. Our neighbourhoods will be a little less upmarket but also less boring. Our kids will rub shoulders with some poorer kids in the school corridor. They might not squeak into an Ivy League college, and they may have to be content going to an excellent public university. But if we aren't willing to entertain even these sacrifices, there is little hope. There will be some material costs, too. The big challenge is to equalise opportunities to acquire human capital and therefore increase the number of true competitors in the labour market. This will require, among other things, some increased public investment. Where will the money come from? It can't all come from the super-rich. Much of it will have to come from the upper middle class. From me and you.

This is an extract from Dream Hoarders: How the American Upper Middle Class is Leaving Everyone Else in the Dust, Why That is a Problem, and What To Do About It by Richard V Reeves (Brookings Institution Press, 2017)

HOW TO STAY AHEAD – OR PLAY FAIR

As parents, we naturally want our children to flourish. But that laudable desire slides into opportunity hoarding when we use our money, power or position to give our own children exclusive access to certain goods or chances. The effect is to strengthen class barriers.

1. Fix an internship using our networks. Internships are becoming more important but are too often stitched up privately. It's worse if they're unpaid. Instead: insist on paid internships, openly recruited.

2. Take our own kids to work for the day. Children learn what work is from adults. Instead: try bringing somebody elses kid to work, perhaps by partnering with local charities.

3. Be a Nimby. By shutting out low-income housing from our neighbourhoods with planning restrictions, we keep less affluent kids away from our local schools and communities. Instead: be a Yimby, vote and argue for more mixed housing in your area.

4. Write cheques to PTA funds. Many of us want to support the school our children attend. This tilts the playing field, however, since other schools can't do the same. Instead: get your PTA to give half the donations to a school in a poor area.

Read more: https://www.theguardian.com/inequality/2017/jul/15/how-us-middle-classes-hoard-opportunity-privilege

Meningitis vaccine may also cut risk of ‘untreatable’ gonorrhoea, study says

Bacteria causing two different illnesses belong to the same family and share much of the same genetic code, providing unexpected cross-protection

Hopes to fight untreatable strains of gonorrhoea have risen after it emerged that a new vaccine against meningitis unexpectedly reduced the risk of people getting the sexually transmitted infection.

Some strains of gonorrhoea are resistant to all available drugs, making vaccine development an urgent global health priority. But according to a study in The Lancet, a vaccine has offered protection against the sexually transmitted disease for the first time.

Gonorrhoea spreads through unprotected vaginal, oral or anal sex and many of those who contract the disease experience no symptoms. If left untreated, the disease can cause infertility and can increase the transmission of HIV infection.

A New Zealand meningitis epidemic in the early 2000s prompted the mass vaccination of a million people and fortuitously set the scene for the current study. The vaccine used, known as MeNZB, was designed to protect against meningococcal group B infection – the cause of the most deadly form of meningitis.

But intriguingly, over the next few years, scientists noticed fewer gonorrhoea cases than expected in those who had been vaccinated against meningitis.

Dr Helen Petousis-Harris, a vaccine specialist from the University of Auckland who led the study, was optimistic: "Some types of gonorrhoea are now resistant to every antibiotic we have, and there appeared [to be] little we could do to prevent the steady march of gonorrhoea to superbug status. But now there's hope," she added.

The research team studied over 14,000 people aged 15-30 who'd been diagnosed with gonorrhoea at sexual health clinics across New Zealand and who had been eligible for the MeNZB vaccine during the emergency vaccination programme. They found vaccinated individuals were over 30% less likely to develop gonorrhoea.

Despite meningitis and gonorrhoea being very different illnesses, both are caused by bacteria from the same family and share much of the same genetic code, providing a possible explanation for the cross-protection that the team observed.

More than 78 million people worldwide get gonorrhoea each year with most infections in men and women under the age of 25. It is the second most common bacterial sexually transmitted infection in the UK after chlamydia. In England alone, almost 35,000 people were affected in 2014.

The British Association for Sexual Health and HIV's president, Dr Elizabeth Carlin, who was not involved in the study, was more sceptical: "These early findings are to be welcomed but it's important to keep in perspective that the vaccine offered only moderate protection … an individual receiving this vaccine remains susceptible to gonorrhoea, but just less so than if unvaccinated."

The MeNZB vaccine used in the current study is no longer manufactured, but Petousis-Harris has high hopes for a similar meningitis vaccine called 4CMenB, available in many countries.

Petousis-Harris was clear about what needed to happen next. "We need an urgent assessment of current meningitis vaccines to see if they protect against gonorrhoea. It may be possible to eliminate many gonorrhoea infections using a vaccine with only moderate protection. It does not need to be perfect," she added.

Read more: https://www.theguardian.com/science/2017/jul/10/meningitis-vaccine-may-also-cut-risk-of-untreatable-gonorrhoea-study-says

People taking heartburn drugs could have higher risk of death, study claims

Research suggests people on proton pump inhibitors are more likely to die than those taking a different antacid or none at all

Millions of people taking common heartburn and indigestion medications could be at an increased risk of death, research suggests.

The drugs, known as proton pump inhibitors (PPIs), neutralise the acid in the stomach and are widely prescribed, with low doses also available without prescription from pharmacies. In the UK, doctors issue more than 50m prescriptions for PPIs every year.

Now researchers say the drugs can increase the risk of death, compared both with taking a different type of acid suppressant and with taking none at all.

"We saw a small excess risk of dying that could be attributed to the PPI drug, and the risk increased the longer they took them," said Ziyad Al-Aly, an epidemiologist from Washington University in St Louis and co-author of the study.

The team say the study suggests those who take the drugs without needing to could be most at risk. They urged people taking PPIs to check whether this was necessary.

Previous research has raised a range of concerns about PPIs, including links to kidney disease, pneumonia, hip fractures and higher rates of infection with C difficile, a superbug that can cause life-threatening sepsis, particularly in elderly people in hospitals.

But the latest study is the first to show that PPIs can increase the chance of death. Published in the journal BMJ Open, it examined the medical records of 3.5 million middle-aged Americans covered by the US veterans healthcare system.

The researchers followed 350,000 participants for more than five years and compared those prescribed PPIs to a group receiving a different type of acid suppressant known as an H2 blocker. They also took into account factors such as the participants' age, sex and conditions ranging from high blood pressure to HIV.

The results show that those who took PPIs could face a 25% higher risk of death than those who took the H2 blocker.

"In patients on [H2 blocker] tablets, there were 3.3 deaths per 100 people over one year. In the PPI group, this figure was higher at 4.7 per 100 people per year," said Al-Aly.

The team also reported that the risk of death for those taking PPIs was 15% higher than those taking no PPIs, and 23% higher than for those taking no acid suppressants at all.

Similar levels of increased risk were seen among people who used PPIs but had no gastrointestinal conditions, a result which the authors speculated might be driving the higher risk seen overall.

Gareth Corbett, a gastroenterologist from Addenbrooke's hospital in Cambridge who was not involved with the study, cautioned against panic, pointing out that in most cases the benefits of PPIs far outweighed any risk. What was more, he said, while the increased risk sounded high, it was still very low for each person.

"PPIs are very effective medicines, proven to save lives and reduce the need for surgery in patients with bleeding gastric and duodenal ulcers and several other conditions," he said.

The study's authors said it was important that PPIs were used only when necessary and stopped when no longer needed.

Corbett agreed that many people take PPIs unnecessarily. "They could get rid of their heartburn by making lifestyle changes, such as losing weight and cutting back on alcohol, caffeine and spicy foods," he said.

The authors said the study was observational, meaning it did not show that PPIs were the cause of the increased risk of death, and that it was unclear how the drugs would act to affect mortality. They said the drugs could affect components within cells known as lysosomes, which help break down waste material, or shorten protective regions at the ends of chromosomes, known as telomeres.

Al-Aly said people on PPIs should check with their GP whether the drugs were still needed, adding: "In some cases we expect that PPIs can be safely stopped, particularly in patients who have been taking them for a long time."

Read more: https://www.theguardian.com/science/2017/jul/04/people-taking-heartburn-drugs-could-have-higher-risk-of-death-study-claims

Brain game: how quitting routine tasks can help you learn new tricks

Daniel Glaser explains the benefits of taking on new challenges in middle age

Although his previous attempt at a career break, by becoming an apprentice shoemaker in Florence, didn't last long, it seems Daniel Day-Lewis is serious about retiring this time.

Maybe he's looking for a new challenge. As we get older, work can feel more routine and easy, which is borne out in terms of brain activity.

Scans show tasks we are practised at often use less energy than novel activities – we tend to do them more efficiently, and the mental energy required decreases. We're all familiar with this as our careers advance.

We also get more skilled at spotting our mistakes and rectifying them; as an old hand, you can notice when the edge has gone but you have enough tricks in the bag to make amends. This neuroprotective effect may be behind some of the results that show an apparent delay in symptoms of age-related cognitive decline for those more active in middle age. In this light a pre-emptive move, like Day-Lewis's, may be more sensible as we become overfamiliar with what we do.

It is perhaps typical of this most uncompromising of actors that he's quitting while ahead.

Dr Daniel Glaser is director of Science Gallery at King's College London

Read more: https://www.theguardian.com/lifeandstyle/2017/jul/02/brain-game-quitting-routine-tasks-to-learn-new-tricks

In Seattle, US old-timers rediscover the high life on cannabis tours

Retirement home residents take a trip to a producer

Forget bingo, tea dances and seaside trips. Residents from a chain of Seattle retirement homes are going on "Pot for Beginners" tours to learn about and buy cannabis in the city, where it's now legal.

Connie Schick said her son roared with laughter when he heard she was joining a field trip to a cannabis-growing operation, an extraction plant and shop. The 79-year-old, who smoked the odd joint in the 70s, wanted to know how legalisation has changed the way the drug is used and produced.

Schick was one of eight women, from their late 60s to mid-80s, who descended from a minibus emblazoned with the name of their assisted living centre, El Dorado West, outside Vela cannabis store last Tuesday.

"You can only play so many games of bingo," said Schick. "My son thought it was hilarious that I was coming here, but I'm open-minded and want to stay informed. Cannabis has come so far from the days when you smoked a sly joint and got into trouble if they found out. We used to call it hemp then and didn't know its strength. It just used to make me sleepy, so I didn't see the point."

Schick, who uses a wheelchair after suffering a stroke, is interested in the therapeutic effects of cannabis. "It's so different now. There are so many ways you can take it, and all these different types to help with aches and pains."

"They used to say it was a gateway drug to other things, like cocaine. Lots of people's views are changing."

Certainly, the number of people aged 65 or older taking cannabis in the US is growing. The proportion of this age group who reported cannabis use in the past year rose more than tenfold from 0.2% to 2.1% between 2002 and 2014, according to the National Survey on Drug Use and Health. A Gallup poll last year showed that 3% of those over 65 smoke cannabis.

Much of this is attributed to the ageing of the baby-boomer generation, who dabbled with the drug when they were young and are returning to it for medical or recreational use as it becomes legal and more normalised. Cannabis is now legal for medical use in 29 states and for medical and recreational use in eight (since 2012 in Seattle and the rest of Washington state).

Most of the women on the tour were more interested in the medical use, although Denise Roux, 67, said: "I would like to buy it to get high too but I'm a cheap high, it doesn't take much."

A seminar over sandwiches was held for the group as they sat in front of the large windows of the cultivation room, where they could see scores of plants growing under intense lighting.

They were told about the different strains: uplifting sativa plants and more sedating indicas. They learned about tetrahydrocannabinol (THC), which gives a high, and cannabidiol (CBD), which does not, making CBD-rich cannabis appealing for medical use. A scientist in a lab coat who worked in the processing facility spoke about terpenes – fragrant oils secreted by glands in the flower that give strains their different smells and flavours. Vials were sniffed and various ways to take cannabis were also covered, including smoking, vaporising and eating it.

Roux, a retired administrative assistant, said: "I'm a big Google girl, but I wanted to talk to people who know about it so I can understand it all better. I have an autoimmune disease, which stops my appetite, and I'm interested in marijuana from that standpoint." She added she had used cannabis recreationally in the 80s and had returned to it to help with her illness. "I use a vape. It makes me sleepy and it's a pain control, and it gives me an appetite."

After the briefing, it was time for shopping. The store looked like an upmarket jeweller's, with muted lighting and art on the walls, except the glass cabinets were stocked with pre-rolled joints, edibles including chocolates and sweets, vape pens and bags of different strains of cannabis rather than diamond rings and necklaces.

Darlene Johnson, 85, a former nurse, perused their contents. On the advice of a bearded bud tender, she bought a deep tissue and joint gel and a tincture to put in drinks, which she hopes will help with her severe neck pain. "I wanted a non-psychoactive option," she said. "I don't want to get high. I used to work in the emergency room and saw people come in sick from taking too many drugs, though not usually marijuana."

Her friend, Nancy Mitchell, 80, has never tried cannabis. She has MS and had read that cannabis could help with her symptoms. "I wanted to know more details," she said. "My kids keep telling me, 'Mom, try it.' I don't want to smoke things, but I see there are other ways."

Smoking is not allowed at El Dorado West. Village Concepts, which runs the chain, has a no-smoking policy and it is illegal to consume cannabis in public in the state.

The chain's director of corporate development, Tracy Willis, said: "There was one man who was smoking it on his patio and he refused to stop, so he had to leave. If you're using an edible, we don't have any issue with it, that's your own business. We treat it as a recreational thing."

The tours began in response to questions from residents. "They wanted to know where it was sold, how much money was made from it, where it was grown," said Willis. "We've had a good reaction [to the tours] from nine out of 10 relatives, but some are horrified. One angry daughter said we were encouraging marijuana use. Her mother told her to butt out."

Participants on the tour learned about different ways to use cannabis. Photograph: Jason Redmond/Reuters

Read more: https://www.theguardian.com/society/2017/jul/01/seattle-retirement-home-cannabis-tours