
Newsletter – October 2014

===========================================================

THE CENTER FOR THIRD AGE NEWSLETTER – OCTOBER 2014

1.  FATHER WILLIAM’S MUSINGS

2.  THE CASE FOR LOW IDEALS

3.  BEWILDERMENT

4.  NO NURSING HOME FOR ME!

5.  GRATEFULNESS AND THE POWER OF RADICAL SUFFICIENCY

6.  THE DIFFERENCE BETWEEN EDUCATION AND TRAINING

7. PRACTICING THE ART OF POSITIVE AGING

8.  SEVEN WAYS TO EXERCISE YOUR BRAIN – AND WHY YOU REALLY NEED TO

9.  THIS MONTH’S LINKS

===========================================================

QUOTES OF THE MONTH – BROOKS/MONTAIGNE, HANGER, NELSON & REMEN

“If others examined themselves attentively, as I do, they would find themselves, as I do, full of inanity and nonsense. Get rid of it I cannot without getting rid of myself.”

“If you’re not bewildered, you’re not paying attention.
Perhaps bewilderment is all there is.”

“Grateful living invites a radical experience of sufficiency, and sufficiency invites us toward using our lives and resources in more radically generous, open-hearted, and conscientious ways.”

“I AM ENOUGH.”

===========================================================

1.  FATHER WILLIAM’S MUSINGS

October greetings, Dear Friends…

Hope you are delighting in spring with us ‘down-under’ in New Zealand or reminiscing in autumn ‘up-over’ in the northern hemisphere; whichever may be the case, the unfolding seasons always provide a provocative context for reflection on life.

These Musings grew organically out of the writings I collected for you this month. This has become the pattern I’ve grown into in my fourth quarter of life, and it very much brings to mind Elder Ed’s “Relaxing Into Participation” (RIP) that has helped me make so much sense out of living. By RIP, Ed, at 97, simply means that “surrendering our attempts to control life will let ‘Our Sources’ give us all the guidance we need to enjoy its flow.” (Rather than ‘Sources,’ you may prefer ‘Life,’ ‘Spirit,’ ‘Intuition,’ etc.; please substitute whatever works best for you.) Gestalt therapist Barry Stevens’ book, Don’t Push the River; It Flows by Itself, is a great way to learn more about Ed’s RIP.

When a particular article strikes me, I put it in my newsletter file, and in the last week of the month, I read the collected pieces another time, winnow them to a few and then ask myself, “Why am I guided to keep these?” This week the four “QUOTES OF THE MONTH” provide most of the answer…

Brooks and Montaigne’s finding “themselves, as I do, full of inanity and nonsense” is certainly the story of my maturing process. While it’s now clear I’ve always been, and still am, “full of inanity and nonsense,” I refused to accept that reality with any consistency until I began this 8th decade of mine. Now I remember daily that getting “rid of it I cannot without getting rid of myself.”

A particular form of “inanity and nonsense” I’ve always insisted on, and still do much of the time, is that “I know what’s going on.” I might have entertained the notion that I was “bewildered” maybe once a decade, but certainly not more frequently than that. As a consultant, I was paid to “know what’s going on,” especially in the psyches of executives and managers of large corporations. Actually I did tell many stories about my own colossal screw-ups in my life, but that was only when I was well past the emotional reality of being completely “bewildered” for some time. What Howard Hanger is talking about is being able to know you have no idea what’s going on at the time you have no idea what’s going on, that is, acknowledging you’re “bewildered” while you are out of control. I, and many other men as well, learned early on that “real men” are always in control and “bewilderment” is a disease of weaker species. The older I get, like Howard, the more comfortable I am with his suggestion that “Perhaps bewilderment is all there is.”

The notion that personal “inanities and nonsense” and “bewilderment” might be the nature of humanity and reality can seem a bit somber. That is why the last two quotations from Kristi Nelson and Rachel Naomi Remen are so necessary (I don’t think it’s coincidental both are women).

Kristi’s piece on GRATEFULNESS AND THE POWER OF RADICAL SUFFICIENCY tells my story of how learning to practice gratitude has changed me for so much the better. “Grateful living invites a radical experience of sufficiency” describes perfectly what practicing gratitude has helped me do. For whatever reasons, even though I’ve always lived a very secure, upper middle class life, I have always been unconsciously afraid of not being or doing or having enough. This shows up in the silliest ways, such as ordering another glass of wine before I’ve finished the one I have. Since coming across gratefulness.org, whenever I start feeling “sufficiency-deprived,” I start looking at all the good things around me and saying, “Thank you for being part of my life.” Almost immediately “radical sufficiency” shifts my perception and my feelings so I live my life “in more radically generous, open-hearted, and conscientious ways.”

And then comes the wonderful Rachel Naomi Remen, 24 years a teacher of medical professionals, with a story of how her students identify their most important memory from the course. “The most common thing that students say in this sharing is a simple three-word phrase: I AM ENOUGH. Year after year it is the same phrase I myself say as well. It is the beginning of everything.”

And that is what I’ve found practicing gratitude has done for me, too. I now know “I AM ENOUGH” – and this knowing of personal sufficiency is “the beginning of everything.”

But what is so powerful about gratitude is how much it gives and how little it asks! Whenever you notice you’re off-center in the slightest, just relax (Elder Ed would say RIP), look around and say “Thank you” to the little things that make your life so easy.* As Meister Eckhart said, “If the only prayer you ever say in your entire life is thank you, it will be enough.”

*If you’d like one of the best examples I’ve seen of practicing gratitude, visit this link and see how much fun you can have doing it, too…

       http://momastery.com/blog/2014/08/11/give-liberty-give-debt/

Much love, FW

PS: Because some of the articles are long, I’m trying to give you a glimpse of why I find them valuable for me in an FW NOTE at the beginning of each – and remember, this is meant to be a month’s worth of relaxed reading…

www.FatherWilliam.org

===========================================================

2.  THE CASE FOR LOW IDEALS

     BY DAVID BROOKS, THE NEW YORK TIMES, OCTOBER 16, 2014

FW NOTE: As a conservative columnist writing for The New York Times, David Brooks seems an oxymoron. In this piece David focuses not on politicians but on those of us who are so judgmental of them, and his mature realism recommends we try a version of ‘grateful living’ in the political realm…

Let’s say you came of political age during Barack Obama’s 2008 campaign. Maybe you were swept up in the idealism. But now you’ve seen an election driven by hope give way to an election driven by fear. Partisans are afraid the other side might win. Candidates are pawns of the consultants because they’re afraid of themselves. Everybody’s afraid of the Ebola virus, ISIS and the fragile economy.

The politics of the last few years have made you disappointed, disillusioned and cynical. You look back at your earlier idealism as cotton candy.

Well, I’m here to make the case for political idealism.

I’m not making the case for the high idealism that surrounded that 2008 campaign. It was based on the idea that people are basically innocent and differences can be quickly transcended. It was based on the idea that society is easily malleable and it’s possible to have quick transformational change. It was based on the idea of a heroic savior (remember those “Hope” posters?).

I’m here to make the case for low idealism. The low idealist rejects the politics of innocence. The low idealist recoils from any movement that promises “new beginnings,” tries to offer transcendent “bliss to be alive” moments or tries to fill people’s spiritual voids.

Low idealism begins with a sturdy and accurate view of human nature. We’re all a bit self-centered, self-interested and inclined to think we are nobler than we are. Montaigne wrote, “If others examined themselves attentively, as I do, they would find themselves, as I do, full of inanity and nonsense. Get rid of it I cannot without getting rid of myself.”

Low idealism continues with a realistic view of politics. Politics is slow drilling through hard boards. It is a series of messy compromises. The core functions of government are negative — putting out fires, arresting criminals, settling disputes — and much of what government does is the unromantic work of preventing bad situations from getting worse.

Politicians operate in a recalcitrant medium with incomplete information, bad options and no sleep. Government in good times is merely dull; when it is enthralling, times are usually bad.

So low idealism starts with a tone of sympathy. Anybody who works in this realm deserves compassion and gentle regard. The low idealist knows that rallies with anthems and roaring are just make-believe, but has warm affection for any politician who exhibits neighborliness, courtesy and the ability to listen. The low idealist understands that those who try to rise above the messy business of deal-making often turn into zealots and wind up sinking below it. On the other hand, this kind of idealist has a full heart for those who serve the practical work of legislating: James Baker and Ted Kennedy in the old days; Bob Corker and Ron Wyden today. Believing experience is the best mode of education, he favors the competent old hand to the naïve outsider.

The low idealist is more romantic about the past than about the future. Though governing is hard, there are some miracles of human creation that have been handed down to us. These include, first and foremost, the American Constitution, but also the institutions that function pretty well, like the Congressional Budget Office and the Federal Reserve. Her first job is to work with existing materials, magnify what’s best and incrementally reform what is worst.

The businessman might be enamored of disruptive change, but the low idealist abhors it in politics. The low idealist liked Obama’s vow to hit foreign policy singles and doubles day by day, so long as there is a large vision to give long-term direction.

The low idealist admires a different kind of leader; not the martyr or the passionate crusader or the righteous populist. He likes the resilient one, who maybe has been tainted by scandals and has learned from his self-inflicted wounds that his own worst enemy is himself.

He likes the person who speaks only after paying minute attention to the way things really are, and whose proposals are grounded in the low stability of the truth.

The low idealist lives most of her life at a deeper dimension than the realm of the political. She believes, as Samuel Johnson put it, that “The happiness of society depends on virtue” — not primarily material conditions. But, and this is what makes her an idealist, she believes that better laws can nurture virtue. Statecraft is soulcraft. Good tax policies can arouse energy and enterprise. Good social programs can encourage compassion and community service.

Low idealism starts with a warts-and-all mentality, but holds that people can be improved by their political relationships, so it ends up with something loftier and more inspiring than those faux idealists who think human beings are not a problem and politics is mostly a matter of moving money around.

http://www.nytimes.com/2014/10/17/opinion/david-brooks-the-case-for-low-ideals.html?_r=0

===========================================================

3.  BEWILDERMENT

     BY HOWARD HANGER, OCTOBER 20, 2014

FW NOTE: In this Mental Breather Howard offers another version of grateful living as he suggests “bewilderment just might lead you into the cathedral of jaw-dropping awe”…

The world is a mind-boggling place. No doubt about it. Simply the fact that it exists at all can blow the cerebellum and fry the thalamus. But, to acknowledge the world as mind-boggling is just the beginning of boggledom.

According to current research, for example, there are approximately 8.7 million species of life on this planet. Of these 8.7 million, only 1.2 million species have been catalogued, which leaves some 86% of earth life and 91% of marine life still awaiting discovery or description.

That brain of yours, which might well be getting boggled at this moment, is mind-boggling all by itself: How it stores some memories and ditches others. How it is even now trying to make heads or tails of these words, wondering if you can hold your hunger pains till dinner-time, trying to figure how you’ll pay that VISA bill and still managing to ignite some excitement about the hot date you have scheduled this weekend.

Then, there’s your body: everything from your ear hair to your toe jam is worthy of a good boggle. Your ability to swallow, digest, fart and poop without even thinking about it is near the top of the boggle list.

Then there are relationships – lover, friend, family, co-workers, pet. Mind-boggling, all.   And government? Mindless-boggling, perhaps. Then there’s literature and Irish dancing, space capsules and on-line dating, Lady Gaga, Mona Lisa and Beethoven’s 9th.

A T-shirt might read: If you’re not bewildered, you’re not paying attention. Perhaps bewilderment is all there is. But rather than it throwing you into confusion and despair, if treated rightly, bewilderment just might lead you into the cathedral of jaw-dropping awe. And there’s nothing like a little awe and wonder to add the “worth” to the “while” of this life.

http://www.contacthoward.com/

===========================================================

4.  NO NURSING HOME FOR ME!

     MAKING THE ROUNDS ANONYMOUSLY

FW NOTE: My friend Mike W forwarded this to me, and, after all the serious stuff, it’s time for a lighten-up break – thanks, Mike!

With the average cost for a nursing home reaching $188.00 per day, there is a better way to spend our savings when we get old and feeble.

I have already checked on reservations at the Holiday Inn. With a combined long-term-stay discount and a senior discount, it comes to only $49.23 per night. That leaves $138.77 a day for breakfast, lunch and dinner in any restaurant I want, or room service, plus:

1.  Laundry, gratuities and special TV movies. Plus, they provide a swimming pool, a workout room, a lounge, washer, dryer, etc. Most have free toothpaste and razors, and all have free shampoo and soap.

2.  They treat you like a customer, not a patient. $5 worth of tips a day will have the entire staff scrambling to help you.

3.  There is a city bus stop out front, and seniors ride free. The handicap bus will also pick you up (if you fake a decent limp).

4.  To meet other nice people, call a church bus on Sundays. For a change of scenery, take the airport shuttle bus and eat at one of the nice restaurants there. While you’re at the airport, fly somewhere. Otherwise the cash keeps building up.

5.  It takes months to get into decent nursing homes. Holiday Inn will take your reservation today. And – you are not stuck in one place forever, you can move from Inn to Inn, or even from city to city. Want to see Hawaii? They have a Holiday Inn there too.

6.  TV broken? Light bulbs need changing? Need a mattress replaced? No problem. They fix everything, and apologize for the inconvenience.

7.  The Inn has a night security person and daily room service. The maid checks to see if you are OK. If not, they will call the undertaker or an ambulance. If you fall and break a hip, Medicare will pay for the hip, and Holiday Inn will upgrade you to a suite for the rest of your life.

8.  And no worries about visits from family. They will always be glad to find you, and will probably check in for a few days’ mini-vacation. The grandkids can use the pool.


What more can you ask for?

So . . .

When I reach the golden age, I’ll face it with a grin – just forward all my email to: me@Holiday_Inn!

http://www.netfunny.com/rhf/jokes/03/Dec/home.html

===========================================================

5.  GRATEFULNESS AND THE POWER OF RADICAL SUFFICIENCY

     BY KRISTI NELSON, GRATEFULNESS.ORG

FW NOTE: I thank Kristi Nelson and Gratefulness.org for this piece. Of all the varied psychological attempts to become more of the relaxed, aware and loving person I mean to be, none has been more effective, simple and practical than grateful living…

Grateful living, or living in touch with the great fullness of life, has the ability to significantly and positively alter our lives and the larger world in which we live.

Grateful living asks us to purposefully direct our awareness to notice all that is already fully present and abundant in our lives – from the tiniest things of beauty to the grandest of our blessings – and in so doing, to take nothing for granted. Grateful living as a practice powerfully affirms that we can be in charge of our attention, and can point it towards that which serves the fullness of our learning, our lives, our relationships, and the world. And, amazingly, every single moment can offer us this opportunity…not a single moment need escape our gratefulness…even if it is simply to learn from that which is most difficult. We have the choice to be in touch with the “fullness” of everything.

In infinite ways, grateful living offers an unparalleled pathway to the experience of “enough,” and even more than enough, in our lives. Suddenly, the barren corners of our homes are rich with things for which to be thankful. What seemed lacking in our relationship now feels full to overflowing. Our bodies are miraculous. Electricity itself blows our minds. Our days can be one discovery after another of blessing and opportunity. And the earth can seem an endless cacophony of beauty.

When we are in touch with enough-ness, when we feel like we are and have enough, we become less susceptible to cultural norms of complaint, envy, scarcity, comparison, and insatiability; all sources of suffering, and separation from ourselves, each other and the planet, and also the ways that we get caught in the “more is better” mentality. When we are so busy unconsciously rushing towards more, as Soul of Money author Lynne Twist says, we rush right over/past “enough” and do not even notice it…like an inconvenient speed-bump.

In this way, grateful living is an antidote to scarcity and insatiability. And it is radical because it establishes the only real, lasting conditions for generosity, kindness, compassion and the impulse to serve. When we are awake to all that is enough in our lives, we can turn our attention beyond ourselves. We need to feel our fullness in order to have anything truly meaningful to offer the world.

And since scarcity and insatiability are the drivers for so much that is unsustainable and unjust in our world right now, grateful living can be seen as not merely a salve of complacency and self-satisfaction, but as a protective impulse that wakes us up to act on behalf of the things for which we feel grateful. In this, gratefulness has the power to awaken us to greater purpose to preserve and tend the things we notice are worth cherishing – all the fragile blessing that surrounds us and is charged to our care.

Grateful living invites a radical experience of sufficiency, and sufficiency invites us toward using our lives and resources in more radically generous, open-hearted, and conscientious ways. This truth offers me hope – for our lives, each other and the world. And hope is a longing and blessing for which we can all be deeply grateful…

http://www.gratefulness.org/readings/gratefulness_sufficiency.htm

http://www.gratefulness.org/gratefulnews/index.htm

===========================================================

6.  THE DIFFERENCE BETWEEN EDUCATION & TRAINING

     BY RACHEL NAOMI REMEN, AWAKIN.ORG, OCTOBER 13, 2014

FW NOTE: Remen makes a telling distinction between training and education from her 24 years of teaching medical professionals, where her dream of medicine was “to become a friend to life”…

For me, the process of education is intimately related to the process of healing. The root word of education — educare — means to lead forth a hidden wholeness in another person. A genuine education fosters self-knowledge, self-trust, creativity and the full expression of one’s unique identity. It gives people the courage to be more. Yet over the years so many health professionals have told me that they feel personally wounded by their experience of professional school and profoundly diminished by it. This was my experience as well.

It has made me wonder. Perhaps what we have all experienced is not an education at all but a training, which is something quite different. Certainly in medicine the training dimension of schooling has become more and more central and assumed a greater importance as the many techniques of the scientific approach have been developed. The goal of a training is competence and replicability. Uniqueness is often discouraged and may even be viewed as dangerous.

A training is all about the right way and the wrong way to do everything. In a training your own way of doing something can often become irrelevant. In such a milieu students often experience their learning as a constant struggle to be good enough. Training creates a culture of relentless evaluation and judgment. In response students try to become someone different than who they are.

At the end of the Healer’s Art teachings, the students stand in a large circle, silently review their memories of the course and identify the most important thing that they learned or remembered during the course. They then turn this insight into an affirmation: a little phrase which begins in one of three ways: I am … I can … or I will. One at a time, the students go around the circle each saying their phrase out loud. This year will be the 24th year that I have taught the course at my medical school. The most common thing that students say in this sharing is a simple three-word phrase: I AM ENOUGH. Year after year it is the same phrase I myself say as well. It is the beginning of everything.

In Medicine, training is essential to technical competence. The real question is, is training good enough?

…My dream of medicine was not to become competent. My dream was to become a friend to life. It was that dream that enabled me to endure the relentless pursuit of competency required of me. But competence did not fulfill me then and could not have fulfilled me for my medical lifetime. Only a dream can do that.

http://www.awakin.org/read/view.php?tid=1052

===========================================================

7.  PRACTICING THE ART OF POSITIVE AGING

     BY WILLIAM A. SADLER, PH.D.

FW NOTE: Bill Sadler, author of The Third Age and a founder of The Center for Third Age Leadership, offers a new form of organization to support positive aging: The North Oakland Village and The Village to Village Network

One of the most significant developments of the twentieth century was the unexpected, unprecedented extension of human life. In 1900 the average life span was just over 47 years; by 2000 it was over 77 years. That’s a 30-year life bonus. Many people are living even longer lives. The fastest-growing cohort in America is centenarians. In 1965 there were just three thousand, but today there are nearly 60,000, and by 2050 there will be a million or more. Most of us over fifty have a bright, long future ahead of us. So the question and challenge for us is: what do we want to put into this future?

Whether we want it or not, a very large part of this future will be an experience of aging. Until recently that was not considered a very bright option. But during the past thirty years an emerging science of aging has been discovering possibilities and options not previously on our radar screen. In the older, usual view, aging was mostly defined by “D” words like decline, degeneration, disability, dementia, and disease. When I started my research the reigning view of aging was “disengagement.” People were expected to withdraw and start shutting down. During the past couple of decades we have started to view aging much more positively, because we know that there are different ways to age. Those “D” words that supposedly defined aging just aren’t normal; they are still common but not inevitable. Many people are beginning to experience vital, healthy, purposeful, and happy outcomes as they grow older. The former negative view is gradually being balanced by “positive aging.”

How can we do that? I believe that especially in the second half, our lives are shaped by the choices we make, not just or primarily by our genes. Genetic influences cannot begin to explain the rapid extension of human life that has occurred. If we are to age positively, we need to take charge of our lives and to design a life style that creates a future we want. That is a major lesson I have learned during the 25 years that I have been tracking individuals who have been creatively designing their lives after 50. Instead of closing down and going into decline, they have been moving to new peaks, tapping their creative potential to discover new options. Instead of declining and shutting down they have been changing course to experience a fulfilling future.

How can people do this? That has been a driving question in my research and in the many exciting, exemplary studies and books about positive aging that have recently been published. To move in a new direction into positive aging is a creative venture, like art. The art of positive aging describes a positive development, which we are beginning to witness not just in America, but in many countries and different cultures. It gives us an option that is very different from usual aging, which is mostly negative. Like serious art, the art of positive aging is hard work, creative, filled with discovery and learning, stimulating and fun, and it leaves a legacy in the lives of those we touch. That possibility presents us with an opportunity and a challenge – to shape the life we really want in the years ahead.

North Oakland Village encourages members to discover, learn, practice, and sustain the art of positive aging as they age in place in their own homes. It also aims to build a community that fosters and supports positive aging for seniors now and in the years to come.

http://www.northoaklandvillage.org/content.aspx?page_id=22&club_id=190478&module_id=110963

http://www.vtvnetwork.org/

===========================================================

8.  SEVEN WAYS TO EXERCISE YOUR BRAIN – AND WHY YOU REALLY NEED TO

     BY MICHELE ROSENTHAL, SYNDICATED FROM REWIREME.COM, OCTOBER 20, 2014

FW NOTE: Michele makes a sobering point: “Cognitive brain function peaks in our early fifties, but staying mentally active can prevent brain loss in the years to follow,” and she suggests some very simple ways to “exercise your brain”…

If you’re over 40, you’re not going to like this (and if you’re not yet 40, get ready for a reality check): Early in your fifth decade, researchers believe, your cognitive brain performance peaks. From there, it’s a downhill slide for the remaining years of your life. The good news is that the brain is highly adaptable; it responds to experiences. In particular, “spaced practice” (repetitive exercise) helps the brain learn, grow, strengthen, and develop. As we age there are ways to combat the reduced function of such mental processes as memory, speed of thinking, problem solving, reasoning, and decision making. Starting to incorporate easy exercises today can help forestall decline tomorrow.

CAN THE BRAIN REVERSE THE AGING PROCESS?

According to Dr. Sandra Bond Chapman, founder and chief director of the Center for BrainHealth and Dee Wyly Distinguished University Chair at UT Dallas, “The world’s aging population is growing disproportionately. Our expected lifespan has reached an all-time high of more than 78 years, yet previous research shows cognitive decline may begin in the early 40s…. Until recently, cognitive decline in healthy adults was viewed as an inevitable consequence of aging. [Our] research shows that neuroplasticity can be harnessed to enhance brain performance and provides hope for individuals to improve their own mental capacity and cognitive brain health by habitually exercising higher-order thinking strategies no matter their age.”


The research Chapman references was conducted at the Center for BrainHealth at the University of Texas at Dallas and published online by Cerebral Cortex. Researchers studied brain changes (using three MRI-based measurements) in a random sampling of people ages 56 to 71. What they discovered is exciting: Over a 12-week period, participants in hour-long sessions of directed brain training exhibited an expanded ability to create structural connections between parts of the brain related to learning and greater information communication across critical brain regions.

Dr. Sina Aslan, founder and president of Advance MRI and collaborator on the study, adds and explains, “Through this research we are able to see that cognitive training increases brain blood flow, which is a sensitive physiological marker of brain health. Previous research shows brain blood flow decreases in people beginning in their 20s. The finding that global brain blood flow can be increased with complex mental activity, as this study demonstrates, suggests that staying mentally active helps reverse and potentially prevent brain losses and cognitive decline with aging.”

In fact, the study shows a more than 8% increase in brain blood flow, which significantly impacts cognitive performance and can help your brain stay young. A followup study a year later confirmed that the gains were maintained. That’s good news if you want to boost your mental muscle!

Right about now you may be wondering what you’d have to do in order to reap these benefits. With the stress of an already packed schedule, do you have time to add yet another item to the calendar? Actually, training your brain is incredibly simple and can be done while moving through the tasks of your day.

7 SCIENTIFICALLY PROVEN, RESULTS-ORIENTED EXERCISES

Your brain is responsible for five main cognitive functions: executive function, memory, attention, language, and visual-spatial skills. If you already squeeze aerobic exercise into your schedule (studies recommend at least three times per week for an hour), then you have a good routine that’s increasing brain blood flow to critical memory centers and improving your ability to remember facts. Adding any of the following cognitive function–building practices will amplify your brain health benefits:

1.  STRATEGIZE – Logic and reasoning skills are the basis for making decisions and considering possible outcomes of your actions. The more you challenge yourself to do these kinds of tasks, the more you deepen the neural pathways necessary for this type of brain function. If you like games, this kind of exercise is right up your alley. Video games and strategic board games (such as chess) are great ways to engage this aspect of brain training. Other options include social interaction or any activity that requires you to identify a desired outcome and then calculate choices and develop a plan to achieve success.

2.  CHALLENGE YOUR MEMORY – You highlight how important memory is to your cognitive function every time you read, reason, or do any type of mental calculation. Memory is also the first place you’ll probably notice your cognitive function faltering. Training your memory is incredibly easy and can be done while you commute or listen to the radio: Commit to learning all the lyrics of a song while you’re driving, or memorize a poem while sitting on the bus. Don’t commute? Force yourself to do a task by memory. For example, wash your face and brush your teeth with your eyes closed, or learn to perform a task with your nondominant hand.

3.  (RE)FOCUS YOUR ATTENTION – Attention is one of the foundational elements of cognition and it decreases with age. Your ability to place your focus (and hold it there), however, allows you to concentrate and be productive despite distractions, which means this is a part of your brain function you want to keep sharp. Increasing this brain ability is as simple as changing your routine. Ninety-eight percent of what you do every day is habit; changing the routine guarantees your brain has to pay attention. There are two ways to work this part of your brain muscle: (1) Identify what you do by rote day after day and change it. That can mean taking a different route to work or school or changing your exercise routine (i.e., do the exercises in reverse order); (2) When you combine activities that require cognitive function, you force your brain to do more in the same amount of time. For example, cook and listen to talk radio or an audiobook, or drive while making a list of groceries in your head.

4.  RESET YOUR BRAIN – As important as it is to be able to pay attention, sometimes it’s even better to give your brain a break. Stilling your mind breaks its rhythm, which causes it to refresh. Giving your mind a break allows it to return to tasks later with increased perspective and creativity. You can think of this as a sort of interval training for your brain. Dr. Chapman suggests a “Five by Five” principle “where you take a break from whatever you’re doing five times a day for at least five minutes to reset.”

5.  BUFF UP YOUR LINGO – Language games stimulate your brain to understand, remember, and recognize words. The more you practice fluency in language, the more quickly your brain will retrieve old words and embrace new ones. Taking the time to understand new words in context especially trains your brain to remember them, since you increase the associations linked with the definition. A simple way to engage this process is to read articles outside your normal realm of interest. Rather than reading the business section of the newspaper, read the sports or science section instead.

6.  SYNTHESIZE, SYNTHESIZE, SYNTHESIZE – According to Keir Bloomer, chair of the Higher Order Skills Excellence Group, “synthesis is the skill of joining up. Essentially, it is the process of forming new knowledge or new ideas by taking different existing ideas and knowledge, sometimes from different areas…. it’s a skill that involves activities like linking, connecting, joining together.” To exercise yourself in this way, make an effort to process information beyond its superficial level. When you read a book or article (including this one!), share what you learn with someone else. Rather than just recounting the facts, identify and discuss the theme(s) in what you read and how they relate to your life.

7.  TAKE A REALLY GOOD LOOK – One of the most dominant senses your brain uses to understand and encode your experience is your visual sense. Being able to visually analyze your environment gives you many cognitive clues about how to behave within it. Developing this part of your brain muscle can be done in two easy ways: (1) In any setting, pick out three items and their location. When you leave the setting, close your eyes and see if you can accurately remember each item and its location; do this again two hours later; (2) For more of a challenge, try noticing everything you can see in your full range of vision (front and peripheral), then write it all down from recall.


Considered in these micro-elements, the ease of adding brain exercise to your day seems obvious. I think you can handle it, so I’m going to sneak in one more surefire way to bump up your gray matter: STOP MULTI-TASKING. Constant simultaneous in/output fatigues your brain and leads to reduced efficiency and productivity. When you need to focus on higher-order thinking (those tasks that really require full access to your brain power), you’ll achieve more if you allow your focus to remain uninterrupted for at least 15 minutes at a time.

All this sounds promising, but understanding the concept that your brain can hold off the aging process is a lot like buying a membership to the gym: It only helps if you actually use it. Which means incorporating these ideas into your everyday experience will require a tiny bit of intention on your part. If you’ve been reading this while also listening to the news on television—an example of combining activities that require cognitive function and thus working out your ability to pay attention—then you’ve already got a good head start.

—–

This article has been republished here with permission from Rewire Me. Rewire Me is a place for mutual inspiration; a resource to enlighten and guide us on our journey toward wholeness and balance.  

http://www.dailygood.org/story/868/7-ways-to-exercise-your-brain-and-why-you-really-need-to-michele-rosenthal/

===========================================================

9.  THIS MONTH’S LINKS:

     THINK COMPUTER GRAPHICS CAN’T MIMIC HUMAN BEINGS?

http://digg.com/video/this-is-ed-he-isnt-real?utm_source=digg&utm_medium=email

     COMMON ‘MYTH-CONCEPTIONS’…

http://www.stumbleupon.com/su/2rMv97/2AAAEd!M:4kzws+D/www.informationisbeautiful.net/visualizations/common-mythconceptions-worlds-most-contagious-falsehoods

     A ‘VILLAGE’ HELPS SENIORS STAY IN THEIR HOMES COMFORTABLY…

http://www.northoaklandvillage.org/

http://www.northoaklandvillage.org/content.aspx?page_id=22&club_id=190478&module_id=110963

     INTERACTIVE VISUAL HISTORY OF THE WORLD, 3500 BC TO 2005 AD…

http://www.timemaps.com/history

     ASTRONAUT CHRIS HADFIELD’S PHOTOS FROM SPACE…

http://qz.com/288018/astronaut-chris-hadfield-took-45000-jaw-dropping-photos-from-space-here-are-some-of-the-best/?utm_source=hootsuite&utm_medium=TN&utm_campaign=social

===========================================================

To subscribe email FatherWilliam@ThirdAgeCenter.com with “Subscribe” in the Subject line. Thank you.

===========================================================

To unsubscribe email FatherWilliam@ThirdAgeCenter.com with “Unsubscribe” in the Subject line. Thank you.

===========================================================

© Copyright 2002-2014, The Center for Third Age Leadership, except where indicated otherwise. All rights reserved worldwide. Reprint only with permission from copyright holder(s). All trademarks are property of their respective owners. All contents provided as is. No express or implied income claims made herein. This newsletter is available by subscription only. We neither use nor endorse the use of spam.

Please feel free to use excerpts from this newsletter as long as you give credit with a link to our page: www.ThirdAgeCenter.com. Thank you!

===========================================================

FW Radio Show: “We’ve Got Our Own Third Age Bard”

If you’re relaxed enough to enjoy one of FW’s hour-long radio shows, “We’ve Got Our Own Third Age Bard” is worth hearing. It’s an interview with old friend David Swords, who’s just retired after a long and successful career in publishing. David shares with us some of what he’s running up against now that his days are so relatively empty – and we get to hear him sing the first new song he’s written in this new, open and somewhat confusing space. You can download the show here:

https://www.dropbox.com/s/gvao7o2ig7azbax/FW-AM%2014-09-03%20David%20%26%20Savior%20PT%201.mp3?dl=0


Newsletter – September 2014

===========================================================
 
THE CENTER FOR THIRD AGE NEWSLETTER – SEPTEMBER 2014
 

1.  FATHER WILLIAM’S MUSINGS
 
2.  NINETY-SIX WORDS FOR LOVE
 
3.  OCCUPATIONAL HAZARDS OF WORKING ON WALL STREET
 
4.  PREPARING MY DAUGHTERS TO BE SINGLE
 
5.  THE CASE FOR DELAYED ADULTHOOD
 
6.  THE DEATH OF ADULTHOOD IN AMERICAN CULTURE
 
7.  HOW THE CULTURE OF WATCHING IS CHANGING US
 
8.  WHY I JUST ASKED MY STUDENTS TO PUT THEIR LAPTOPS AWAY
 
9.  THIS MONTH’S LINKS
 
===========================================================
 
QUOTES OF THE MONTH – WILLIAM DERESIEWICZ, RAM DASS & JOHN ENRIGHT

 
     “Is there any way I can avoid becoming an entitled little shit?”

     “To heal the world emanate love, not hate.”

     “‘Judge not that ye be not judged’ does not mean that if you judge others, they will judge you back. It means that whatever you do outwardly, you will do with greater intensity to yourself inwardly. If you want to stop being so hard on yourself, stop being so judgmental of others.”
 
===========================================================
 
1.  FATHER WILLIAM’S MUSINGS
 
September greetings, Dear Friends…
 
About 30 years ago on a personal retreat I wrote a piece called “YOU ARE WHERE YOU PUT YOURSELF,” a play on the ’70s slogan “YOU ARE WHAT YOU EAT.” It was, in my late 40s, a reflection upon and codification of a profound and humbling understanding I finally arrived at – We really do become the environments we put ourselves in. This recollection was inspired by all the articles included in this month’s newsletter, especially Michael Lewis’ piece “OCCUPATIONAL HAZARDS OF WORKING ON WALL STREET,” which calls for a repeat of William Deresiewicz’s quote from last month’s newsletter:
 
     “Is there any way I can avoid becoming an entitled little shit?”
 
Lewis’ article expands Deresiewicz’s “entitled little shit” disease from elite education to soul-corroding careers – and beyond. We do literally become where we put ourselves. And where others put us.
 
Increasingly these environments are becoming more virtual than real, more digitized brainwashing than physical coercion. In scanning tonight’s headlines and political cartoons on the web, I recognized how addicted I am to infecting my spirits with images of negativity presented to me in “the news.” Finally I am making a connection to the wisdom of “love thy neighbor” that has been around me all my life. Ram Dass put it this way in this recent quote from Gratefulness.org…
 
     “To heal the world emanate love, not hate.”
 
Most of what I ingest as “news” emanates hate, not love. More importantly, after reading and watching such “news,” I am more prone to emanate hate myself.
 
It does seem we humans are programmed to attend to drama. As far back as I can remember, before there was any technology except dial-up telephone and AM radio, I, and almost everyone I knew, was drawn to drama. What occurs to me now is to be interested in the few who were not. How many of those few who approach non-judgmentalness can I remember clearly?
 
Beverly in high school comes to mind. Dick in college. Harlan in my first job. Sarah, Barney  and Stef as I began my teaching career. It’s hard to remember many from the late 60s through the 70s, an environment of ego-driven activism and righteousness for me and those I was trying so hard to emulate and be valued by.
 
Why? Because the righteous activist environment I put myself in surrounded me with the company of others who were cynically judgmental as well. And I didn’t just choose their company, I chose them as the audience I needed to impress. That choice was a source of hate-emanation that has stuck with me for a number of decades.
 
For all of the effort I put into becoming a spiritual and loving being, it is only now at 76 years of age that I see what I’m writing about here. I have had the words and concepts since my first communion at six – and I have managed to misunderstand and misuse them as judgments for all this time. One of my great teachers, John Enright, said this to me in 1971:
 
     “‘Judge not that ye be not judged’ does not mean that if you judge others, they will judge you back. It means that whatever you do outwardly, you will do with greater intensity to yourself inwardly. If you want to stop being so hard on yourself, stop being so judgmental of others.”
 
Forty-three years later I’m beginning to transform this concept into behavior.
 
So what could I possibly offer to a younger me from this perspective? What could I say or write to my younger self that he could hear? I have no idea, but I suspect many older folks offer little because we can’t imagine anyone getting through to us “back then” – when we were so caught up trying to fit “the right way to be…”
 
And once again the Universe provides – I’m reading a marvelous novel right now, “Where the Rēkohu Bone Sings” by Tina Makereti. It is about the tribal history of the Moriori and the Maori in New Zealand, and it is not a pretty history. The Moriori, a people committed to peace and emanating love, created and lived in a unique culture on Rēkohu, now known as the Chatham Islands. Two Maori tribes, overpowered on the New Zealand mainland by other warring tribes, invaded Rēkohu in the 1800s. The Moriori welcomed them with love. The invading Maori, whose history was one of dominate or be dominated, enslaved those Moriori they didn’t exterminate. This novel is not just a history of injustice and hate-emanation, but opens a totally different way of thinking about human beings and their possibilities. I cannot recommend it highly enough, especially to those of us who have known only cultures that believe they must dominate or be dominated.
 
In the book an ancestor, Imi, watches from the “no-time” space as his descendants try to cope with the now “civilized world” they find themselves in – and rediscover parts of their history that might bring them back to the possibilities of “emanating love.” Their journey is helping me with mine even though their roots are not mine to return to. But, along with their struggles, Ram Dass’ offering of “emanate love” and my maturing, I, too, am feeling possibilities of returning to “emanating love.” It is a feeling that is making sense of everything for me now.
 
So, while focusing September’s newsletter on the dangers of unawareness in choosing our environments, let’s begin with “NINETY-SIX WORDS FOR LOVE”…
 
Much love, FW
 
PS: Because some of the articles are long, I’m trying to give you a glimpse of why I find them valuable for me; this is in the form of an ‘FW NOTE’ at the beginning of each – and remember, this is meant to be a month’s worth of relaxed reading…

PS#2: You’ll notice we’re using a new format – we’d welcome any feedback you have…
 
www.FatherWilliam.org
 
===========================================================
 
2.  NINETY-SIX WORDS FOR LOVE

 
     BY ROBERT JOHNSON, AWAKIN.ORG, SEPTEMBER 17, 2014
 
FW NOTE:  I greatly respect the unconscious power of words – 76 years of experience have made clear to me that “the way we phrase our lives is the way we end up living them.” And I also admire Robert Johnson, who wrote many books, among them HE, SHE, WE and TRANSFORMATION. The first three helped me greatly reduce my Masculine-Feminine confusion, and the last gave me another glimpse of how First, Second and Third Ages, rightly understood, can light our way home to “elderhood”.
 
The first difficulty we meet in discussing anything concerning our feelings is that we have no adequate vocabulary to use. Where there is no terminology, there is no consciousness. A poverty-stricken vocabulary is an immediate admission that the subject is inferior or depreciated in that society.
 
Sanskrit has ninety-six words for love; ancient Persian has eighty, Greek three, and English only one. This is an indication to me of the poverty of awareness or emphasis that we give to that tremendously important realm of feeling. Eskimos have thirty words for snow, because it is a life-and-death matter to them to have exact information about the element they live with so intimately. If we had a vocabulary of thirty words for love … we would immediately be richer and more intelligent in this human element so close to our heart. An Eskimo probably would die of clumsiness if he had only one word for snow; we are close to dying of loneliness because we have only one word for love. Of all the Western languages, English may be the most lacking when it comes to feeling.
 
Imagine what richness would be expressed if one had a specific vocabulary for the love of one’s father, another word for the love of one’s mother, yet another for one’s camel (the Persians have this luxury), still another for one’s spouse, and another exclusively for the sunset! Our world would expand and gain clarity immeasurably if we had such tools.
 
It is always the inferior function, whether in an individual or a culture, that suffers this poverty. One’s greatest treasures are won by the superior function but always at the cost of the inferior function. One’s greatest triumphs are always accompanied by one’s greatest weaknesses. Because thinking is our superior function in the English-speaking world, it follows automatically that feeling is our inferior function. These two faculties tend to exist at the expense of each other. If one is strong in feeling, one is likely to be inferior in thinking — and vice versa. Our superior function has given us science and the higher standard of living — but at the cost of impoverishing the feeling function.
 
This is vividly demonstrated by our meager vocabulary of feeling words. If we had the expanded and exact vocabulary for feeling that we have for science and technology, we would be well on our way to warmth of relatedness and generosity of feeling.
 
http://www.awakin.org/index.php?op=show_email
 
===========================================================
 
3.  OCCUPATIONAL HAZARDS OF WORKING ON WALL STREET

 
     BY MICHAEL LEWIS, BLOOMBERG VIEW, SEPT. 24, 2014
 
FW NOTE:  As I said in my Musings, all the articles included here caused me to reflect again on “WE BECOME WHERE WE PUT OURSELVES,” this one in particular. For a current example of how powerfully seductive, and dangerously self-destructive, certain environments can be, read on…
 
A few times in the past several decades it has sounded as if big Wall Street banks were losing their hold on the graduates of the world’s most selective universities: the early 1990s, the dot-com boom and the immediate aftermath of the global financial crisis (Teach for America!). Each time the graduating class of Harvard and Yale looked as if it might decide, en masse, that it wanted to do something with its life other than work for Morgan Stanley.
 
Each time it turned out that it didn’t.
 
Silicon Valley is once again bubbling, and, in response, big Wall Street banks are raising starting salaries, and reducing the work hours of new recruits. But it’s hard to see why this time should be any different from the others.
 
Technology entrepreneurship will never have the power to displace big Wall Street banks in the central nervous system of America’s youth, in part because tech entrepreneurship requires the practitioner to have an original idea, or at least to know something about computers, but also because entrepreneurship doesn’t offer the sort of people who wind up at elite universities what a lot of them obviously crave: status certainty.
 
“I’m going to Goldman,” is still about as close as it gets in the real world to “I’m going to Harvard,” at least for the fiercely ambitious young person who is ambitious to do nothing in particular.
 
The question I’ve always had about this army of young people with seemingly endless career options who wind up in finance is: What happens next to them? People like to think they have a “character,” and that this character of theirs will endure, no matter the situation. It’s not really so. People are vulnerable to the incentives of their environment, and often the best a person can do, if he wants to behave in a certain manner, is to choose carefully the environment that will go to work on his character.
 
One moment this herd of graduates of the nation’s best universities are young people — ambitious yes, but still young people — with young people’s ideals and hopes to live a meaningful life. The next they are essentially old people, at work gaming ratings companies, and designing securities to fail so they might make a killing off the investors they dupe into buying them, and rigging various markets at the expense of the wider society, and encouraging all sorts of people to do stuff with their capital and their companies that they never should do.
 
Not everyone on Wall Street does stuff that would have horrified them, had it been described to them in plain English, when they were 20. But enough do that it makes you wonder. What happens between then and now?
 
All occupations have hazards. An occupational hazard of the Internet columnist, for instance, is that he becomes the sort of person who says whatever he thinks will get him the most attention rather than what he thinks is true, so often that he forgets the difference.
 
The occupational hazards of Wall Street are more interesting — and not just because half the graduating class of Harvard still wants to work there. Some are obvious — for instance, the temptation, when deciding how to behave, to place too much weight on the very short term and not enough on the long term. Or the temptation, if you make a lot of money, to deploy financial success as an excuse for failure in other aspects of your life. But some of the occupational hazards on Wall Street are less obvious.
 
Here are a few that seem, just now, particularly relevant:
 
— Anyone who works in finance will sense, at least at first, the pressure to pretend to know more than he does.
 
It’s not just that people who pick stocks, or predict the future price of oil and gold, or select targets for corporate acquisitions, or persuade happy, well-run private companies to go public don’t know what they are talking about: what they pretend to know is unknowable. Much of what Wall Street sells is less like engineering than like a forecasting service for a coin-flipping contest — except that no one mistakes a coin-flipping contest for a game of skill. To succeed in this environment you must believe, or at least pretend to believe, that you are an expert in matters where no expertise is possible. I’m not sure it’s any easier to be a total fraud on Wall Street than in any other occupation, but on Wall Street you will be paid a lot more to forget your uneasy feelings.
 
— Anyone who works in big finance will also find it surprisingly hard to form deep attachments to anything much greater than himself.
 
You may think you are going to work for Credit Suisse or Barclays, and will there join a team of professionals committed to the success of your bank, but you will soon realize that your employer is mostly just a shell for the individual ambitions of the people who inhabit it. The primary relationship of most people in big finance is not to their employer but to their market. This simple fact resolves many great Wall Street mysteries. An outsider looking in on the big Wall Street banks in late 2008, for instance, might ask, “How could all these incredibly smart and self-interested people have come together and created collective suicide?” More recently the same outsider might wonder, “Why would a trader rig Libor, or foreign exchange rates, or the company’s dark pool, when the rewards for the firm are so trivial compared with the cost, if he is caught? Why, for that matter, wouldn’t some Wall Street bank set out to rat out the bad actors in their market, and set itself up as the honest broker?”
 
The answer is that the people who work inside the big Wall Street firms have no serious stake in the long-term fates of their firms. If the place blows up they can always do what they are doing at some other firm — so long as they have maintained their stature in their market. The quickest way to lose that stature is to alienate the other people in it. When you see others in your market doing stuff at the expense of the broader society, your first reaction, at least early in your career, might be to call them out, but your considered reaction will be to keep mum about it. And when you see people making money in your market off some broken piece of internal machinery — say, gameable ratings companies, or riggable stock exchanges, or manipulable benchmarks — you will feel pressure not to fix the problem, but to exploit it.
 
— More generally, anyone who works in big finance will feel enormous pressure to not challenge or question existing arrangements.
 
One of our financial sector’s most striking traits is how fiercely it resists useful, disruptive entrepreneurship that routinely upends other sectors of our economy. People in finance are paid a lot of money to disrupt every sector of our economy. But when it comes to their own sector, they are deeply wary of market-based change. And they have the resources to prevent it from happening. To take one example: in any other industry, IEX, the new stock market created to eliminate a lot of unnecessary financial intermediation (and the subject of my last book) would have put a lot of existing players out of business. (And it still might.) The people who run IEX have very obviously found a way to make the U.S. stock market — and other automated financial markets — more efficient and, in the bargain, reduce, by some vast amount, the take of the financial sector. Because of this they now face what must be one of the best organized and funded smear campaigns outside of U.S. politics: underhanded attacks from anonymous Internet trolls, congressional hearings staged to obfuscate problems in the market, by senators who take money from the obfuscators; op-ed articles from prominent former regulators, now employed by the Wall Street machine, that spread outright lies about the upstarts; error-ridden pieces by prominent journalists too stupid or too lazy or too compromised to do anything but echo what they are told by the very people who make a fortune off the inefficiencies the entrepreneurs seek to eliminate.
 
The intense pressure to conform, to not make waves, has got to be the most depressing part of all, for a genuinely ambitious young person. It’s pretty clear that the government lacks the power to force serious change upon the financial sector. There’s a big role for Silicon Valley-style scorched-earth entrepreneurship on Wall Street right now, and the people most likely to innovate are newcomers to the industry who have no real stake in the parts of it that need scorching.
 
As a new employee on Wall Street you might think this has nothing to do with you. You would be wrong. Your new environment’s resistance to market forces, and to the possibility of doing things differently and more efficiently, will soon become your own. When you start your career you might think you are setting out to change the world, but the world is far more likely to change you.
 
So watch yourself, because no one else will.
 
http://www.bloombergview.com/articles/2014-09-24/occupational-hazards-of-working-on-wall-street?utm_source=digg&utm_medium=email
 
===========================================================
 
4.  PREPARING MY DAUGHTERS TO BE SINGLE

 
     BY KASEY EDWARDS, STUFF.CO.NZ/LIFE-STYLE, 14/09/2014

FW NOTE:  But what about the environments we are born into, which imprint us with their values before we can make any choices for ourselves? How do we recover from the cultural programming that shuts possibilities out of our conscious awareness? In this piece, Kasey Edwards offers one map for helping our young open up those possibilities again…
 
LIFE ISN’T A WAITING ROOM: I want my daughters to be empowered to choose to be in a relationship because they want to, not because they feel they need to.
 
“Why does she put up with him?”
 
It’s a question we’ve all asked about our friends’ relationships when we’ve heard about or seen their partners behaving badly.
 
And if I’m being completely honest, looking back on some of my past boyfriends, I wish I’d asked myself the same question.
 
Some women are trapped in toxic relationships (which is a separate issue entirely), but other women stay in them because they believe that a bad relationship – or even an okay one – is better than being single.

From Cinderella to The Bachelor, girls can’t escape the message that being single is the equivalent of life’s waiting room.
 
This can encourage women to stay in relationships when they probably shouldn’t and tolerate or excuse behaviours that they definitely shouldn’t.
 
I want my daughters to be empowered to choose to be in a relationship because they want to, not because they think they need to.
 
I’m determined to instill in them that being single is an acceptable, and even desirable, option.
 
The following are some small, everyday ways of getting girls to practice the habits of independence.
 
1. CREATING OPPORTUNITIES WHERE THEY CAN LEARN TO ENTERTAIN THEMSELVES
 
Learning to enjoy your own company is powerful because you can make better choices about how you spend your time and who you spend it with. We have ‘quiet time’ every day in our house, where everybody spends an hour playing on their own.
 
Not only do I need the sanity break, but it’s also an opportunity for me to model the pleasure that can be found in solitude.
 
When my five-year-old tells me she’s bored, I try to reframe it as an opportunity for her to learn to entertain herself.
 
2. INSTILLING THE VALUE OF FEMALE FRIENDSHIP
 
I want my girls to know that even though their dad and I love to spend time together, there will always be a special place in my life for my female friends.
 
Before my girls start dating, I want them to understand the importance of female friendship and how it needs to be nurtured and protected.
 
No matter how besotted they become with a boy, they should always make time for their female friends.
 
Boys will come and go but it’s our female friends who will bring us ice-cream and tissues in the middle of the night.
 
And if a man ever tries to isolate them from their friends they should see it as a neon warning sign for unhealthy possessiveness and controlling behaviour.
 
3. TEACHING THEM TO BE FINANCIALLY SECURE AND INDEPENDENT
 
Money is a pretty crappy reason to stay in a relationship. But it also exerts a strong pull.
 
I want my girls to understand that the best relationships are when both people are independent and choose to be together.
 
It’s important to teach them about budgeting and saving and sexually transmitted debt.
 
Young love can make people do stupid things with money, so before they’re lovesick I want to caution them about the common pitfalls, such as unwisely lending money to their lovers, prematurely buying joint assets, and buying overly extravagant gifts.
 
4. BUYING THEM A TOOL BELT AND TEACHING THEM HOW TO USE IT
 
‘I wish daddy was here, he’d know what to do,’ was my daughter’s response when the batteries went flat on her favourite toy.
 
It was a wake-up call to me to start modeling how to fix stuff.
 
You don’t have to go out and restore a car or take apart a computer, but changing a light bulb is enough to demonstrate that women and men can take charge and fix problems.
 
5. ENCOURAGING THEM TO EXPLORE THEIR SEXUALITY FROM AN EARLY AGE
 
The prevalence of ‘sluts’ and ‘whores’ in young adult literature and schoolyard banter is enough to make a feminist mother weep.
 
Our daughters learn early the same sexually oppressive messages that we learnt: that female sexuality is a prize to be given to (or taken by) a man.
 
I’ve seen it with friends who embarrass their young daughters by telling them to stop touching themselves because it’s ‘dirty down there’.
 
And so begins a pattern of lifelong shame.
 
Allowing girls to learn to self-satisfy empowers them to take charge of their own sexual desires – both with and without a partner.
 
* It’s possible my girls will choose female partners, but for the purpose of this article I’ll assume their future partners are male.
 
Kasey Edwards is a writer and bestselling author
 
http://www.stuff.co.nz/life-style/life/10497686/Preparing-my-daughters-to-be-single
 
===========================================================
 
5.  THE CASE FOR DELAYED ADULTHOOD

 
     BY LAURENCE STEINBERG, THE NEW YORK TIMES SUNDAY REVIEW, SEPT. 19, 2014
 
FW NOTE:  I hadn’t previously thought of “delaying adulthood” as an important part of the process of becoming an elder. Steinberg’s focus on the importance of extending “brain plasticity” (which delaying adulthood does, for neurological reasons) gives me a radically different perspective on younger generations’ “laziness.” On a more personal note, you might also find this shift in viewpoint helpful – especially if you have 20-somethings who’ve moved back home and “aren’t taking on their fair share of ‘adult’ responsibilities”…
 
ONE of the most notable demographic trends of the last two decades has been the delayed entry of young people into adulthood. According to a large-scale national study conducted since the late 1970s, it has taken longer for each successive generation to finish school, establish financial independence, marry and have children. Today’s 25-year-olds, compared with their parents’ generation at the same age, are twice as likely to still be students, only half as likely to be married and 50 percent more likely to be receiving financial assistance from their parents.
 
People tend to react to this trend in one of two ways, either castigating today’s young people for their idleness or acknowledging delayed adulthood as a rational, if regrettable, response to a variety of social changes, like poor job prospects. Either way, postponing the settled, responsible patterns of adulthood is seen as a bad thing.
 
This is too pessimistic. Prolonged adolescence, in the right circumstances, is actually a good thing, for it fosters novelty-seeking and the acquisition of new skills.
 
Studies reveal adolescence to be a period of heightened “plasticity” during which the brain is highly influenced by experience. As a result, adolescence is both a time of opportunity and vulnerability, a time when much is learned, especially about the social world, but when exposure to stressful events can be particularly devastating. As we leave adolescence, a series of neurochemical changes make the brain increasingly less plastic and less sensitive to environmental influences. Once we reach adulthood, existing brain circuits can be tweaked, but they can’t be overhauled.
 
You might assume that this is a strictly biological phenomenon. But whether the timing of the change from adolescence to adulthood is genetically preprogrammed from birth or set by experience (or some combination of the two) is not known. Many studies find a marked decline in novelty-seeking as we move through our 20s, which may be a cause of this neurochemical shift, not just a consequence. If this is true — that a decline in novelty-seeking helps cause the brain to harden — it raises intriguing questions about whether the window of adolescent brain plasticity can be kept open a little longer by deliberate exposure to stimulating experiences that signal the brain that it isn’t quite ready for the fixity of adulthood.
 
Evolution no doubt placed a biological upper limit on how long the brain can retain the malleability of adolescence. But people who can prolong adolescent brain plasticity for even a short time enjoy intellectual advantages over their more fixed counterparts. Studies have found that those with higher I.Q.s, for example, enjoy a longer stretch of time during which new synapses continue to proliferate and their intellectual development remains especially sensitive to experience. It’s important to be exposed to novelty and challenge when the brain is plastic not only because this is how we acquire and strengthen skills, but also because this is how the brain enhances its ability to profit from future enriching experiences.
 
With this in mind, the lengthy passage into adulthood that characterizes the early 20s for so many people today starts to look less regrettable. Indeed, those who can prolong adolescence actually have an advantage, as long as their environment gives them continued stimulation and increasing challenges.
 
What do I mean by stimulation and challenges? The most obvious example is higher education, which has been shown to stimulate brain development in ways that simply getting older does not. College attendance pays neural as well as economic dividends.
 
Naturally, it is possible for people to go to college without exposing themselves to challenge, or, conversely, to surround themselves with novel and intellectually demanding experiences in the workplace. But generally, this is more difficult to accomplish on the job than in school, especially in entry-level positions, which typically have a learning curve that hits a plateau early on.
 
Alas, something similar is true of marriage. For many, after its initial novelty has worn off, marriage fosters a lifestyle that is more routine and predictable than being single does. Husbands and wives both report a sharp drop in marital satisfaction during the first few years after their wedding, in part because life becomes repetitive. A longer period of dating, with all the unpredictability and change that come with a cast of new partners, may be better for your brain than marriage.
 
If brain plasticity is maintained by staying engaged in new, demanding and cognitively stimulating activity, and if entering into the repetitive and less exciting roles of worker and spouse helps close the window of plasticity, delaying adulthood is not only O.K.; it can be a boon.
 
Laurence Steinberg, a professor of psychology at Temple University, is the author of “Age of Opportunity: Lessons From the New Science of Adolescence.”
 
http://www.nytimes.com/2014/09/21/opinion/sunday/the-case-for-delayed-adulthood.html?_r=0
 
===========================================================
 
6.  THE DEATH OF ADULTHOOD IN AMERICAN CULTURE

 
     BY A. O. SCOTT, THE NEW YORK TIMES MAGAZINE/THE CULTURE ISSUE, SEPT. 11, 2014
 
FW NOTE:  This is a perfect follow-up to the previous piece. It expands and deepens the complexities of flourishing in both childhood and adult years, especially given the virtual brainwashing created by our highly digitized lives. I’ve believed I was relatively protected from media conditioning, but almost all of Scott’s examples have impacted me. Also, this is a good lead-in to the last two articles, which are helping me recognize how pervasively distracting electronic gadgetry can be – even living at a retreat centre in the New Zealand bush.
 
Sometime this spring, during the first half of the final season of “Mad Men,” the popular pastime of watching the show — recapping episodes, tripping over spoilers, trading notes on the flawless production design, quibbling about historical details and debating big themes — segued into a parlor game of reading signs of its hero’s almost universally anticipated demise. Maybe the 5 o’clock shadow of mortality was on Don Draper from the start. Maybe the plummeting graphics of the opening titles implied a literal as well as a moral fall. Maybe the notable deaths in previous seasons (fictional characters like Miss Blankenship, Lane Pryce and Bert Cooper, as well as figures like Marilyn Monroe and Medgar Evers) were premonitions of Don’s own departure. In any case, fans and critics settled in for a vigil. It was not a matter of whether, but of how and when.
 
TV characters are among the allegorical figures of our age, giving individual human shape to our collective anxieties and aspirations. The meanings of “Mad Men” are not very mysterious: The title of the final half season, which airs next spring, will be “The End of an Era.” The most obvious thing about the series’s meticulous, revisionist, present-minded depiction of the past, and for many viewers the most pleasurable, is that it shows an old order collapsing under the weight of internal contradiction and external pressure. From the start, “Mad Men” has, in addition to cataloging bygone vices and fashion choices, traced the erosion, the gradual slide toward obsolescence, of a power structure built on and in service of the prerogatives of white men. The unthinking way Don, Pete, Roger and the rest of them enjoy their position, and the ease with which they abuse it, inspires what has become a familiar kind of ambivalence among cable viewers. Weren’t those guys awful, back then? But weren’t they also kind of cool? We are invited to have our outrage and eat our nostalgia too, to applaud the show’s right-thinking critique of what we love it for glamorizing.
 
TONY SOPRANO, WALTER WHITE & DON DRAPER ARE THE LAST OF THE PATRIARCHS.
 
The widespread hunch that “Mad Men” will end with its hero’s death is what you might call overdetermined. It does not arise only from the internal logic of the narrative itself, but is also a product of cultural expectations. Something profound has been happening in our television over the past decade, some end-stage reckoning. It is the era not just of mad men, but also of sad men and, above all, bad men. Don is at once the heir and precursor to Tony Soprano, that avatar of masculine entitlement who fended off threats to the alpha-dog status he had inherited and worked hard to maintain. Walter White, the protagonist of “Breaking Bad,” struggled, early on, with his own emasculation and then triumphantly (and sociopathically) reasserted the mastery that the world had contrived to deny him. The monstrousness of these men was inseparable from their charisma, and sometimes it was hard to tell if we were supposed to be rooting for them or recoiling in horror. We were invited to participate in their self-delusions and to see through them, to marvel at the mask of masculine competence even as we watched it slip or turn ugly. Their deaths were (and will be) a culmination and a conclusion: Tony, Walter and Don are the last of the patriarchs.
 
In suggesting that patriarchy is dead, I am not claiming that sexism is finished, that men are obsolete or that the triumph of feminism is at hand. I may be a middle-aged white man, but I’m not an idiot. In the world of politics, work and family, misogyny is a stubborn fact of life. But in the universe of thoughts and words, there is more conviction and intelligence in the critique of male privilege than in its defense, which tends to be panicky and halfhearted when it is not obtuse and obnoxious. The supremacy of men can no longer be taken as a reflection of natural order or settled custom.
 
This slow unwinding has been the work of generations. For the most part, it has been understood — rightly in my view, and this is not really an argument I want to have right now — as a narrative of progress. A society that was exclusive and repressive is now freer and more open. But there may be other less unequivocally happy consequences. It seems that, in doing away with patriarchal authority, we have also, perhaps unwittingly, killed off all the grown-ups.
 
A little over a week after the conclusion of the first half of the last “Mad Men” season, the journalist and critic Ruth Graham published a polemical essay in Slate lamenting the popularity of young-adult fiction among fully adult readers. Noting that nearly a third of Y.A. books were purchased by readers ages 30 to 44 (most of them presumably without teenage children of their own), Graham insisted that such grown-ups “should feel embarrassed about reading literature for children.” Instead, these readers were furious. The sentiment on Twitter could be summarized as “Don’t tell me what to do!” as if Graham were a bossy, uncomprehending parent warning the kids away from sugary snacks toward more nutritious, chewier stuff.
 
It was not an argument she was in a position to win, however persuasive her points. To oppose the juvenile pleasures of empowered cultural consumers is to assume, wittingly or not, the role of scold, snob or curmudgeon. Full disclosure: The shoe fits. I will admit to feeling a twinge of disapproval when I see one of my peers clutching a volume of “Harry Potter” or “The Hunger Games.” I’m not necessarily proud of this reaction. As cultural critique, it belongs in the same category as the sneer I can’t quite suppress when I see guys my age (pushing 50) riding skateboards or wearing shorts and flip-flops, or the reflexive arching of my eyebrows when I notice that a woman at the office has plastic butterfly barrettes in her hair.
 
God, listen to me! Or don’t. My point is not so much to defend such responses as to acknowledge how absurd, how impotent, how out of touch they will inevitably sound. In my main line of work as a film critic, I have watched over the past 15 years as the studios committed their vast financial and imaginative resources to the cultivation of franchises (some of them based on those same Y.A. novels) that advance an essentially juvenile vision of the world. Comic-book movies, family-friendly animated adventures, tales of adolescent heroism and comedies of arrested development do not only make up the commercial center of 21st-century Hollywood. They are its artistic heart.
 
Meanwhile, television has made it very clear that we are at a frontier. Not only have shows like “The Sopranos” and “Mad Men” heralded the end of male authority; we’ve also witnessed the erosion of traditional adulthood in any form, at least as it used to be portrayed in the formerly tried-and-true genres of the urban cop show, the living-room or workplace sitcom and the prime-time soap opera. Instead, we are now in the age of “Girls,” “Broad City,” “Masters of Sex” (a prehistory of the end of patriarchy), “Bob’s Burgers” (a loopy post-“Simpsons” family cartoon) and a flood of goofy, sweet, self-indulgent and obnoxious improv-based web videos.
 
ADULTHOOD AS WE HAVE KNOWN IT HAS BECOME CONCEPTUALLY UNTENABLE.
 
What all of these shows grasp at, in one way or another, is that nobody knows how to be a grown-up anymore. Adulthood as we have known it has become conceptually untenable. It isn’t only that patriarchy in the strict, old-school Don Draper sense has fallen apart. It’s that it may never really have existed in the first place, at least in the way its avatars imagined. Which raises the question: Should we mourn the departed or dance on its grave?
 
Before we answer that, an inquest may be in order. Who or what killed adulthood? Was the death slow or sudden? Natural or violent? The work of one culprit or many? Justifiable homicide or coldblooded murder?
 
We Americans have never been all that comfortable with patriarchy in the strict sense of the word. The men who established our political independence — guys who, for the most part, would be considered late adolescents by today’s standards (including Benjamin Franklin, in some ways the most boyish of the bunch) — did so partly in revolt against the authority of King George III, a corrupt, unreasonable and abusive father figure. It was not until more than a century later that those rebellious sons became paternal symbols in their own right. They weren’t widely referred to as Founding Fathers until Warren Harding, then a senator, used the phrase around the time of World War I.

From the start, American culture was notably resistant to the claims of parental authority and the imperatives of adulthood. Surveying the canon of American literature in his magisterial “Love and Death in the American Novel,” Leslie A. Fiedler suggested, more than half a century before Ruth Graham, that “the great works of American fiction are notoriously at home in the children’s section of the library.” Musing on the legacy of Rip Van Winkle and Huckleberry Finn, he broadened this observation into a sweeping (and still very much relevant) diagnosis of the national personality: “The typical male protagonist of our fiction has been a man on the run, harried into the forest and out to sea, down the river or into combat — anywhere to avoid ‘civilization,’ which is to say the confrontation of a man and woman which leads to the fall to sex, marriage and responsibility. One of the factors that determine theme and form in our great books is this strategy of evasion, this retreat to nature and childhood which makes our literature (and life!) so charmingly and infuriatingly ‘boyish.’ ”
 
Huck Finn is for Fiedler the greatest archetype of this impulse, and he concludes “Love and Death” with a tour de force reading of Twain’s masterpiece. What Fiedler notes, and what most readers of “Huckleberry Finn” will recognize, is Twain’s continual juxtaposition of Huck’s innocence and instinctual decency with the corruption and hypocrisy of the adult world.
 
Huck’s “Pap” is a thorough travesty of paternal authority, a wretched, mean and dishonest drunk whose death is among the least mourned in literature. When Huck drifts south from Missouri, he finds a dysfunctional patriarchal order whose notions of honor and decorum mask the ultimate cruelty of slavery. Huck’s hometown represents “the world of belongingness and security, of school and home and church, presided over by the mothers.” But this matriarchal bosom is as stifling to Huck as the land of Southern fathers is alienating. He finds authenticity and freedom only on the river, in the company of Jim, the runaway slave, a friend who is by turns Huck’s protector and his ward.
 
The love between this pair repeats a pattern Fiedler discerned in the bonds between Ishmael and Queequeg in “Moby-Dick” and Natty Bumppo and Chingachgook in James Fenimore Cooper’s Leatherstocking novels (which Twain famously detested). What struck Fiedler about these apparently sexless but intensely homoerotic connections was their cross-cultural nature and their defiance of heterosexual expectation. At sea or in the wilderness, these friends managed to escape both from the institutions of patriarchy and from the intimate authority of women, the mothers and wives who represent a check on male freedom.
 
WHAT HAPPENS TO BOY REBELS WHEN THE DREAM OF PERPETUAL CHILDHOOD FADES?
 
Fiedler saw American literature as sophomoric. He lamented the absence of books that tackled marriage and courtship — for him the great grown-up themes of the novel in its mature, canonical form. Instead, notwithstanding a few outliers like Henry James and Edith Wharton, we have a literature of boys’ adventures and female sentimentality. Or, to put it another way, all American fiction is young-adult fiction.
 
The elevation of the wild, uncivilized boy into a hero of the age remained a constant even as American society itself evolved, convulsed and transformed. While Fiedler was sitting at his desk in Missoula, Mont., writing his monomaniacal tome, a youthful rebellion was asserting itself in every corner of the culture. The bad boys of rock ‘n’ roll and the pouting screen rebels played by James Dean and Marlon Brando proved Fiedler’s point even as he was making it. So did Holden Caulfield, Dean Moriarty, Augie March and Rabbit Angstrom — a new crop of semi-antiheroes in flight from convention, propriety, authority and what Huck would call the whole “sivilized” world.

From there it is but a quick ride on the Pineapple Express to Apatow. The Updikean and Rothian heroes of the 1960s and 1970s chafed against the demands of marriage, career and bureaucratic conformity and played the games of seduction and abandonment, of adultery and divorce, for high existential stakes, only to return a generation later as the protagonists of bro comedies. We devolve from Lenny Bruce to Adam Sandler, from “Catch-22” to “The Hangover,” from “Goodbye, Columbus” to “The Forty-Year-Old Virgin.”
 
But the antics of the comic man-boys were not merely repetitive; in their couch-bound humor we can detect the glimmers of something new, something that helped speed adulthood to its terminal crisis. Unlike the antiheroes of eras past, whose rebellion still accepted the fact of adulthood as its premise, the man-boys simply refused to grow up, and did so proudly. Their importation of adolescent and preadolescent attitudes into the fields of adult endeavor (see “Billy Madison,” “Knocked Up,” “Step Brothers,” “Dodgeball”) delivered a bracing jolt of subversion, at least on first viewing. Why should they listen to uptight bosses, stuck-up rich guys and other readily available symbols of settled male authority?
 
That was only half the story, though. As before, the rebellious animus of the disaffected man-child was directed not just against male authority but also against women. In Sandler’s early, funny movies, and in many others released under Apatow’s imprimatur, women are confined to narrowly archetypal roles. Nice mommies and patient wives are idealized; it’s a relief to get away from them and a comfort to know that they’ll take care of you when you return. Mean mommies and controlling wives are ridiculed and humiliated. Sexually assertive women are in need of being shamed and tamed. True contentment is only found with your friends, who are into porn and “Star Wars” and weed and video games and all the stuff that girls and parents just don’t understand.
 
The bro comedy has been, at its worst, a cesspool of nervous homophobia and lazy racial stereotyping. Its postures of revolt tend to exemplify the reactionary habit of pretending that those with the most social power are really beleaguered and oppressed. But their refusal of maturity also invites some critical reflection about just what adulthood is supposed to mean. In the old, classic comedies of the studio era — the screwbally roller coasters of marriage and remarriage, with their dizzying verbiage and sly innuendo — adulthood was a fact. It was incontrovertible and burdensome but also full of opportunity. You could drink, smoke, flirt and spend money. The trick was to balance the fulfillment of your wants with the carrying out of your duties.
 
The desire of the modern comic protagonist, meanwhile, is to wallow in his own immaturity, plumbing its depths and reveling in its pleasures. Sometimes, as in the recent Seth Rogen movie “Neighbors,” he is able to do that within the context of marriage. At other, darker times, say in Adelle Waldman’s literary comedy of manners, “The Love Affairs of Nathaniel P.,” he will remain unattached and promiscuous, though somewhat more guiltily than in his Rothian heyday, with more of a sense of the obligation to be decent. It should be noted that the modern man-boy’s predecessors tended to be a lot meaner than he allows himself to be.
 
UNLIKE PAST ANTI-HEROES, MODERN MAN-BOYS SIMPLY REFUSED TO GROW UP & DID SO PROUDLY.
 
But they also, at least some of the time, had something to fight for, a moral or political impulse underlying their postures of revolt. The founding brothers in Philadelphia cut loose a king; Huck Finn exposed the dehumanizing lies of American slavery; Lenny Bruce battled censorship. When Marlon Brando’s Wild One was asked what he was rebelling against, his thrilling, nihilistic response was “Whaddaya got?” The modern equivalent would be “. . .”
 
Maybe nobody grows up anymore, but everyone gets older. What happens to the boy rebels when the dream of perpetual childhood fades and the traditional prerogatives of manhood are unavailable? There are two options: They become irrelevant or they turn into Louis C. K. Every white American male under the age of 50 is some version of the character he plays on “Louie,” a show almost entirely devoted to the absurdity of being a pale, doughy heterosexual man with children in a post-patriarchal age. Or, if you prefer, a loser.
 
The humor and pathos of “Louie” come not only from the occasional funny feelings that he has about his privileges — which include walking through the city in relative safety and the expectation of sleeping with women who are much better looking than he is — but also, more profoundly, from his knowledge that the conceptual and imaginative foundations of those privileges have crumbled beneath him. He is the center of attention, but he’s not entirely comfortable with that. He suspects that there might be other, more interesting stories around him, funnier jokes, more dramatic identity crises, and he knows that he can’t claim them as his own. He is above all aware of a force in his life, in his world, that by turns bedevils him and gives him hope, even though it isn’t really about him at all. It’s called feminism.
 
Who is the most visible self-avowed feminist in the world right now? If your answer is anyone other than Beyoncé, you might be trying a little too hard to be contrarian. Did you see her at the V.M.A.’s, in her bejeweled leotard, with the word “feminist” in enormous illuminated capital letters looming on the stage behind her? A lot of things were going on there, but irony was not one of them. The word was meant, with a perfectly Beyoncé-esque mixture of poise and provocation, to encompass every other aspect of her complicated and protean identity. It explains who she is as a pop star, a sex symbol, the mother of a daughter and a partner in the most prominent African-American power couple not currently resident in the White House.
 
And while Queen Bey may be the biggest, most self-contradicting, most multitude-containing force in popular music at the moment, she is hardly alone. Taylor Swift recently described how, under the influence of her friend Lena Dunham, she realized that “I’ve been taking a feminist stance without saying so,” which only confirmed what anyone who had been listening to her smart-girl power ballads already knew. And while there will continue to be hand-wringing about the ways female singers are sexualized — cue the pro and con think pieces about Nicki Minaj, Katy Perry, Miley Cyrus, Iggy Azalea, Lady Gaga, Kesha and, of course, Madonna, the mother of them all — it is hard to argue with their assertions of power and independence. Take note of the extent and diversity of that list and feel free to add names to it. The dominant voices in pop music now, with the possible exception of rock, which is dad music anyway, belong to women. The conversations rippling under the surfaces of their songs are as often as not with other women — friends, fans…
 
Similar conversations are taking place in the other arts: in literature, in stand-up comedy and even in film, which lags far behind the others in making room for the creativity of women. But television, the monument valley of the dying patriarchs, may be where the new cultural feminism is making its most decisive stand. There is now more and better television than there ever was before, so much so that “television,” with its connotations of living-room furniture and fixed viewing schedules, is hardly an adequate word for it anymore. When you look beyond the gloomy-man, angry-man, antihero dramas that too many critics reflexively identify as quality television — “House of Cards,” “Game of Thrones,” “True Detective,” “Boardwalk Empire,” “The Newsroom” — you find genre-twisting shows about women and girls in all kinds of places and circumstances, from Brooklyn to prison to the White House. The creative forces behind these programs are often women who have built up the muscle and the résumés to do what they want.
 
Many people forget that the era of the difficult TV men, of Tony and Don and Heisenberg, was also the age of the difficult TV mom, of shows like “Weeds,” “United States of Tara,” “The Big C” and “Nurse Jackie,” which did not inspire the same level of critical rapture partly because they could be tricky to classify. Most of them occupied the half-hour rather than the hourlong format, and they were happy to swerve between pathos and absurdity. Were they sitcoms or soap operas? This ambiguity, and the stubborn critical habit of refusing to take funny shows and family shows as seriously as cop and lawyer sagas, combined to keep them from getting the attention they deserved. But it also proved tremendously fertile.
 
WHY SHOULD BOYS BE THE ONLY ONES WITH THE RIGHT TO REVOLT?
 
The cable half-hour, which allows for both the concision of the network sitcom and the freedom to talk dirty and show skin, was also home to “Sex and the City,” in retrospect the most influential television series of the early 21st century. “Sex and the City” put female friendship — sisterhood, to give it an old political inflection — at the center of the action, making it the primary source of humor, feeling and narrative complication. “The Mary Tyler Moore Show” and its spinoffs did this in the 1970s. But Carrie and her girlfriends could be franker and freer than their precursors, and this made “Sex and the City” the immediate progenitor of “Girls” and “Broad City,” which follow a younger generation of women pursuing romance, money, solidarity and fun in the city.
 
Those series are, unambiguously, comedies, though “Broad City” works in a more improvisational and anarchic vein than “Girls.” Their more inhibited broadcast siblings include “The Mindy Project” and “New Girl.” The “can women be funny?” pseudo-debate of a few years ago, ridiculous at the time, has been settled so decisively it’s as if it never happened. Tina Fey, Amy Poehler, Amy Schumer, Aubrey Plaza, Sarah Silverman, Wanda Sykes: Case closed. The real issue, in any case, was never the ability of women to get a laugh but rather their right to be as honest as men.
 
And also to be as rebellious, as obnoxious and as childish. Why should boys be the only ones with the right to revolt? Not that the new girls are exactly Thelma and Louise. Just as the men passed through the stage of sincere rebellion to arrive at a stage of infantile refusal, so, too, have the women progressed by means of regression. After all, traditional adulthood was always the rawest deal for them.
 
Which is not to say that the newer styles of women’s humor are simple mirror images of what men have been doing. On the contrary. “Broad City,” with the irrepressible friendship of the characters played by Ilana Glazer and Abbi Jacobson at its center, functions simultaneously as an extension and a critique of the slacker-doofus bro-posse comedy refined (by which I mean exactly the opposite) by “Workaholics” or the long-running web-based mini-sitcom “Jake and Amir.” The freedom of Abbi and Ilana, as of Hannah, Marnie, Shoshanna and Jessa on “Girls” — a freedom to be idiotic, selfish and immature as well as sexually adventurous and emotionally reckless — is less an imitation of male rebellion than a rebellion against the roles it has prescribed. In Fiedler’s stunted American mythos, where fathers were tyrants or drunkards, the civilizing, disciplining work of being a grown-up fell to the women: good girls like Becky Thatcher, who kept Huck’s pal Tom Sawyer from going too far astray; smothering maternal figures like the kind but repressive Widow Douglas; paragons of sensible judgment like Mark Twain’s wife, Livy, of whom he said he would “quit wearing socks if she thought them immoral.”
 
Looking at those figures and their descendants in more recent times — and at the vulnerable patriarchs lumbering across the screens to die — we can see that to be an American adult has always been to be a symbolic figure in someone else’s coming-of-age story. And that’s no way to live. It is a kind of moral death in a culture that claims youthful self-invention as the greatest value. We can now avoid this fate. The elevation of every individual’s inarguable likes and dislikes over formal critical discourse, the unassailable ascendancy of the fan, has made children of us all. We have our favorite toys, books, movies, video games, songs, and we are as apt to turn to them for comfort as for challenge or enlightenment.
 
Y.A. fiction is the least of it. It is now possible to conceive of adulthood as the state of being forever young. Childhood, once a condition of limited autonomy and deferred pleasure (“wait until you’re older”), is now a zone of perpetual freedom and delight. Grown people feel no compulsion to put away childish things: We can live with our parents, go to summer camp, play dodge ball, collect dolls and action figures and watch cartoons to our hearts’ content. These symptoms of arrested development will also be signs that we are freer, more honest and happier than the uptight fools who let go of such pastimes.
 
I do feel the loss of something here, but bemoaning the general immaturity of contemporary culture would be as obtuse as declaring it the coolest thing ever. A crisis of authority is not for the faint of heart. It can be scary and weird and ambiguous. But it can be a lot of fun, too. The best and most authentic cultural products of our time manage to be all of those things. They imagine a world where no one is in charge and no one necessarily knows what’s going on, where identities are in perpetual flux. Mothers and fathers act like teenagers; little children are wise beyond their years. Girls light out for the territory and boys cloister themselves in secret gardens. We have more stories, pictures and arguments than we know what to do with, and each one of them presses on our attention with a claim of uniqueness, a demand to be recognized as special. The world is our playground, without a dad or a mom in sight.
 
I’m all for it. Now get off my lawn.
 
A. O. Scott is a chief film critic for The Times. He last wrote for the magazine about a crazy thing that happened on Twitter.
 
http://www.nytimes.com/2014/09/14/magazine/the-death-of-adulthood-in-american-culture.html?action=click&contentCollection=Opinion&module=MostEmailed&version=Full&region=Marginalia&src=me&pgtype=article
 
===============================================
 
7.  NINE WAYS THE CULTURE OF WATCHING IS CHANGING US

 
     BY BARRY CHUDAKOV, SYNDICATED FROM REWIREME.COM, SEPTEMBER 11, 2014
 
FW NOTE:  Chudakov’s opening sentence says just what it is he writes about, and I gained much insight from his nine specific ways in which we are being changed…
 
Our constant use of cameras, TVs, computers, and smart devices is affecting our thoughts and behavior to a degree we may not even realize…
 
Watching and being watched are no longer confined to how newborns bond with their mothers or apprentice chefs learn from sushi masters. Watching now changes how we identify ourselves and how others understand us. “Selfies” are not an anomaly; they are personal reflections of a wholesale adoption of the new culture of watching. We are watching so many—and so many are watching us in so many different places and ways—that watching and being watched fundamentally alter how we think and behave.
 
While 50% of our neural tissue is directly or indirectly related to vision, it is only in the last 100 years that image-delivery technologies (cameras, TVs, computers, smart devices) arrived. Here is a list of some ways all this watching is changing us.
 
1. THE MORE WE WATCH, THE MORE WE BELIEVE WATCHING IS NECESSARY—AND THE MORE WE INVENT REASONS TO WATCH.
 
Today the average person will have spent nine years of their life doing something that is not an essential human endeavor: watching other people, often people they don’t know. I’m talking, of course, about watching TV.
 
When asked to choose between watching TV and spending time with their fathers, 54% of 4- to 6-year-olds in the U.S. preferred television. The average American youth spends 900 hours a year in school and 1,200 hours a year watching TV.
 
In Korea today there are eating broadcasts, called muk-bang: online channels streaming live feeds of people eating large quantities of food while chatting with viewers who pay to watch them.
 
A survey of first-time plastic surgery patients found that 78% were influenced by reality television and 57% of all first-time patients were “high-intensity” viewers of cosmetic-surgery reality TV.
 
We watch housewives and Kardashians, TED talks and LOL cats. We watch people next to us (via the Android I-Am app) and people in 10-second “snaps” anywhere an IP address finds them (via Snapchat). The more we watch, the less we notice how much we’re watching. It seems it’s not only what we’re watching but the act of watching itself that beguiles us. The more devices and screens we watch, the more we rationalize our watching, give it precedence in our lives, tell ourselves it has meaning and purpose. We are redefining—and rewiring—ourselves in the process. This is the new (and very seductive) culture of watching.
 
In Japan’s Osaka train station—where an average of 413,000 passengers board trains every day—an independent research agency will soon deploy 90 cameras and 50 servers to watch and track faces as they move around the station. The purpose: to validate the safety of emergency exits in the event of a disaster. The technology can identify faces with a 99.99% accuracy rate.
 
2. WATCHING BUILDS AND TRANSMITS CULTURE.
 
We watch to learn. Evolutionary eons have taught us to watch to learn where we are, what is around us, what we need to pay attention to, where danger and excitement lurk. “Watching others is a favorite activity of young primates,” says Frans de Waal, one of the world’s leading primate behavior experts. This is how we build and transmit culture, he explains.
 
What are we learning from all this watching?
 
Thanks to wifi built into almost anything with a lens, we are learning to share what we watch. Jonah Berger, Wharton Associate Professor of Marketing at the University of Pennsylvania, looked at video sharing and created an “arousal index,” explaining that “physiological arousal is characterized by activation of the autonomic nervous system, and the mobilization caused by this excitory state may boost sharing.” Google Think Insights calls the YouTube generation Generation C for connection, community, creation, curation: 50% of Gen C talk to friends after watching a video, and 38% share videos on an additional social network after watching them on YouTube. As we watch emotionally charged content, our bodies—specifically, our autonomic nervous system—are compelled to share.
 
3. WATCHING TAKES US INTO RELATIONSHIPS AND ACTIONS WHERE WE ARE NOT PHYSICALLY PRESENT–AND THIS FUNDAMENTALLY ALTERS WHAT EXPERIENCE MEANS.
 
The experience of playing baseball, launching a missile attack, getting trapped in a mudslide, or chasing Maria Menounos is far different from watching those things. Yet now that we can watch almost anything—often while it happens—we must consider the neuroscience of “mirroring” that occurs when we watch others.
 
When our eyes are open, vision accounts for two-thirds of the electrical activity of the brain. But it is our mirror neurons—which V. S. Ramachandran, distinguished professor of neuroscience at the University of California, San Diego, calls “the basis of civilization”—that transport watching into the strange territory of being in an action where we’re not physically present.
 
As Le Anne Schreiber wrote in This Is Your Brain on Sports:
 
“[A]bout one-fifth of the neurons that fire in the premotor cortex when we perform an action (say, kicking a ball) also fire at the sight of somebody else performing that action. A smaller percentage fire even when we only hear a sound associated with an action (say, the crack of a bat). This subset of motor neurons that respond to others’ actions as if they were our own are called ‘mirror neurons,’ and they seem to encode a complete archive of all the muscle movements we learn to execute over the course of our lives, from the first smile and finger wag to a flawless triple toe loop.”
 
When we watch, we feel we’re there.
 
4. WATCHING REPLACES HUMAN FRIENDS AND COMPANIONS—WE NOW HAVE MANY SIGNIFICANT OTHERS WE DO NOT KNOW.
 
It appears that the idea of having some sense of a relationship with people who are not physically present, whom you do not know (in the conventional sense of having met them or being friends with them), arrived with the widespread adoption of television around 1950. Since then, these so-called parasocial relationships have become so common that we take them for granted. Television, virtual worlds, and gaming have created replacements for friends: people who occupy space in our media rooms and minds on an occasional basis.
 
Researchers now believe that loneliness motivates individuals to seek out these relationships, defying the obvious fact that the relationships are not real. The Real Housewives of Atlanta has 2,345,625 Facebook fans, who in some measure take real housewives into their own real lives.
 
People who watched a favorite TV show when they were feeling lonely reported feeling less lonely while watching. Further, while many of us experience lower self-esteem and a negative mood following a fight or social rejection, researchers found that those participants who experienced a relationship threat and then watched their favorite TV show were actually buffered against the blow to self-esteem, negative mood, and feelings of rejection.
 
It pays to have friends on TV.
 
5. WATCHING BLURS THE LINES BETWEEN SELF AND OTHER, MERGING THE WATCHER AND THE WATCHED.

From micro video security cameras (“less than one inch square”) to The Rich Kids of Beverly Hills, watching is now someone’s business plan. Eyeball-hungry producers especially want to blur the boundaries between the game of reality TV and the illusion of living real lives.
 
The result: Watch culture alters not only our sense of privacy in public; there is always someone in the vanity mirror looking back at us. (Author Jarod Kintz quipped: “A mirror is like my own personal reality TV show—where I’m both the star and only viewer. I’ve got to get my ratings up.”) As cameras obsessively follow other lives, our identity adjusts. Rather than acknowledge the artifice of lives deliberately programmed for storylines and conflicts—the lifeblood of so-called reality TV—we fuse our emotions and concerns with others’ professions, houses, cars, friends, husbands, and wives.
 
When watching assumes greater importance, the people we watch become personal replacements; they stand in our places and we in theirs. Models, stars, and athletes are the body doubles of watch culture. These doubles become our bodies: according to WebMD, reality television is contributing to eating disorders in girls. Since the boom of reality television in 2000, eating disorders in teenage girls (ages 13-19) have nearly tripled.
 
New technologies make us all paparazzi. 20 Day Stranger, an app developed by the MIT Media Lab Playful Systems research group and MIT’s Dalai Lama Center for Ethics and Transformative Values, makes it possible to swap lives with—and watch—a stranger for 20 days:
 
“As you and your distant partner get up and go to work or school or wherever else the world takes you, the app tracks your path, pulling related photos from Foursquare or Google Maps along the way. If you stop in a certain coffee shop, the app will find a picture someone took there, and send it to your partner.”
 
Ostensibly designed to “build empathy and awareness,” 20 Day Stranger delivers snackable images via smartphone, which strokes your inner voyeur while enabling yet another person to watch you and “slowly get an impression of [your] life.”
 
When Shain Gandee, star of MTV’s Buckwild, died, his vehicle stuck deep in a mud pit, Huffington Post’s Jesse Washington asked, “Was Gandee living for the cameras that night or for himself?”
 
This watcher-watched merger is growing uneasy. Many a real housewife—from Atlanta to Orange County—may begin to wonder: Whose life is it, anyway?
 
6. WATCHING REDEFINES INTIMACY.
 
Professor Simon Louis Lajeunesse of the University of Montreal wanted to compare the behavior of men who viewed sexually explicit material with those who had never seen it. He had to drastically rethink his study after failing to find any male volunteers who had never watched porn.
 
The hallmark of watch culture is the remove. In the snug blind of the Internet or from the private places we take our devices, we are hidden, removed from interaction while watching action. Because we can now watch anonymously, we have opened a Pandora’s box of previously hidden urges. In such interactions, we are seeing a new kind of affinity: what researchers call “intimacy at a distance.”
 
In this faux intimacy, watching easily turns to spying. As our lenses take us to parts and pores we could have barely imagined only a generation ago, the urge to watch is so compelling that we adopt its logic—as we do with all our tools—and we easily move from watching what we can see to watching what we could see. With a camera in the baby’s room I could watch the nanny; with a camera on the third floor I could watch the clones in Accounting to see if they’re up to any funny business. Economic or security intentions ensure that this slope hardly feels slippery; we move down it easily, seamlessly slipping from watching to spying to invading and then to destroying what others thought were their personal moments, and what many of us consider privacy.
 
7. WATCHING ALTERS AND OFTEN ELIMINATES BOUNDARIES.
 
When we don’t know, we watch.
 
After the disappearance of Malaysia Airlines Flight 370, commentator Michael Smerconish and others argued that video should be fed in real time out of every airline cockpit to help investigators. Of course, pilots are in a professional class that is unique. But today there are many businesses where security and confidentiality are paramount. How long before we apply the “learn by watching” logic to software engineers or doctors? We have already applied it to all our public and commercial spaces.
 
With the array of gadgetry available to us all, it is virtually impossible not to want to see everything. The new culture of watching overcomes time and space and takes precedence over moral and ethical boundaries.
 
8. WATCHING REALITY CHANGES IT.
 
Watching not only changes our narratives—what we say about the world; it changes what we know and how we know it. Pew recently reported that we get more of our information now from watching news (via TV and mobile devices) than from any other method. But “information” in this sense is now affected by—even mixed up with—the other watching we do. Writing on CNN Opinion, Carol Costello asked, “Why are we still debating climate change?” In 2013, 10,883 out of 10,885 scientific articles agreed: Global warming is happening, and humans are to blame. Citing lack of public confidence in these scientists, Costello wrote:
 
“Most Americans can’t even name a living scientist. I suspect the closest many Americans get to a living, breathing scientist is the fictional Dr. Sheldon Cooper from CBS’s sitcom The Big Bang Theory. Sheldon is brilliant, condescending, and narcissistic. Whose trust would he inspire?”
 
There is a logic here that is difficult to understand rationally but is operative nonetheless: What we know is not what we experience but what we watch.
 
9. THE MORE WE WATCH, THE MORE WATCHERS WATCH US.
 
We watch housewives and Kardashians, TED talks and LOL cats. We watch people next to us (via the Android I-Am app) and people in 10-second “snaps” anywhere an IP address finds them (via Snapchat). The more we watch, the less we notice how much we’re watching.
 
So it is not surprising that watching boomerangs—creating watchers who watch us back from hidden or out-of-sightline cameras. Watchers monitor our faces and bodies coming and going in convenience stores, gas stations, banks, department stores, and schools. Newly formed companies have created thriving businesses watching people “passing through doorways, passageways or in open areas” to count them, track them, and analyze what can be seen from an “unlimited number of cameras.”
 
Even driving to the store you’re being watched, via your license plate.
 
Ironically, the culture of watching will compel us—sooner or later—to keep watch: to be mindful of how much we watch and how much all this watching changes us. That may be the best way to detect and positively affect what is happening right before our eyes.
 
Barry Chudakov is a founder and principal of Sertain Research, author of Metalifestream and The Tool That Tells the Story, and a market researcher and brand and media consultant.
 
http://www.dailygood.org/story/854/9-ways-the-culture-of-watching-is-changing-us-barry-chudakov/
 
===========================================================
 
8.  WHY I JUST ASKED MY STUDENTS TO PUT THEIR LAPTOPS AWAY

 
     BY CLAY SHIRKY, MEDIUM.COM/@CSHIRKY, SEPTEMBER 9, 2014
 
FW NOTE:  As a teacher myself, I identify with Clay Shirky and his struggle with policing media use in class and at home. But what got my attention the most was this quote: “The fact that hardware and software is being professionally designed to distract was the first thing that made me willing to require rather than merely suggest that students not use devices in class.” His illumination of the consequences and costs of digital-enabled ‘multi-tasking’ is of importance to all ages…
 
I teach theory and practice of social media at NYU, and am an advocate and activist for the free culture movement, so I’m a pretty unlikely candidate for internet censor, but I have just asked the students in my fall seminar to refrain from using laptops, tablets, and phones in class.
 
I came late and reluctantly to this decision — I have been teaching classes about the internet since 1998, and I’ve generally had a laissez-faire attitude towards technology use in the classroom. This was partly because the subject of my classes made technology use feel organic, and when device use went well, it was great. Then there was the competitive aspect — it’s my job to be more interesting than the possible distractions, so a ban felt like cheating. And finally, there’s not wanting to infantilize my students, who are adults, even if young ones — time management is their job, not mine.
 
Despite these rationales, the practical effects of my decision to allow technology use in class grew worse over time. The level of distraction in my classes seemed to grow, even though it was the same professor and largely the same set of topics, taught to a group of students selected using roughly the same criteria every year. The change seemed to correlate more with the rising ubiquity and utility of the devices themselves, rather than any change in me, the students, or the rest of the classroom encounter.
 
Over the years, I’ve noticed that when I do have a specific reason to ask everyone to set aside their devices (‘Lids down’, in the parlance of my department), it’s as if someone has let fresh air into the room. The conversation brightens, and more recently, there is a sense of relief from many of the students. Multi-tasking is cognitively exhausting — when we do it by choice, being asked to stop can come as a welcome change.
 
So this year, I moved from recommending setting aside laptops and phones to requiring it, adding this to the class rules: “Stay focused. (No devices in class, unless the assignment requires it.)” Here’s why I finally switched from ‘allowed unless by request’ to ‘banned unless required’.
 
We’ve known for some time that multi-tasking is bad for the quality of cognitive work, and is especially punishing of the kind of cognitive work we ask of college students.
 
This effect takes place over more than one time frame — even when multi-tasking doesn’t significantly degrade immediate performance, it can have negative long-term effects on “declarative memory”, the kind of focused recall that lets people characterize and use what they learned from earlier studying. (Multi-tasking thus makes the famous “learned it the day before the test, forgot it the day after” effect even more pernicious.)
 
People often start multi-tasking because they believe it will help them get more done. Those gains never materialize; instead, efficiency is degraded. However, it provides emotional gratification as a side-effect. (Multi-tasking moves the pleasure of procrastination inside the period of work.) This side-effect is enough to keep people committed to multi-tasking despite worsening the very thing they set out to improve.
 
On top of this, multi-tasking doesn’t even exercise task-switching as a skill. A study from Stanford reports that heavy multi-taskers are worse at choosing which task to focus on. (“They are suckers for irrelevancy”, as Cliff Nass, one of the researchers, put it.) Multi-taskers often think they are like gym rats, bulking up their ability to juggle tasks, when in fact they are like alcoholics, degrading their abilities through over-consumption.
 
This is all just the research on multi-tasking as a stable mental phenomenon. Laptops, tablets and phones — the devices on which the struggle between focus and distraction is played out daily — are making the problem progressively worse. Any designer of software as a service has an incentive to be as ingratiating as they can be, in order to compete with other such services. “Look what a good job I’m doing! Look how much value I’m delivering!”
 
This problem is especially acute with social media, because on top of the general incentive for any service to be verbose about its value, social information is immediately and emotionally engaging. Both the form and the content of a Facebook update are almost irresistibly distracting, especially compared with the hard slog of coursework. (“Your former lover tagged a photo you are in” vs. “The Crimean War was the first conflict significantly affected by use of the telegraph.” Spot the difference?)
 
Worse, the designers of operating systems have every incentive to be arms dealers to the social media firms. Beeps and pings and pop-ups and icons, contemporary interfaces provide an extraordinary array of attention-getting devices, emphasis on “getting.” Humans are incapable of ignoring surprising new information in our visual field, an effect that is strongest when the visual cue is slightly above and beside the area we’re focusing on. (Does that sound like the upper-right corner of a screen near you?)
 
The form and content of a Facebook update may be almost irresistible, but when combined with a visual alert in your immediate peripheral vision, it is—really, actually, biologically—impossible to resist. Our visual and emotional systems are faster and more powerful than our intellect; we are given to automatic responses when either system receives stimulus, much less both. Asking a student to stay focused while she has alerts on is like asking a chess player to concentrate while someone raps their knuckles with a ruler at unpredictable intervals.
 
Jonathan Haidt’s metaphor of the elephant and the rider is useful here. In Haidt’s telling, the mind is like an elephant (the emotions) with a rider (the intellect) on top. The rider can see and plan ahead, but the elephant is far more powerful. Sometimes the rider and the elephant work together (the ideal in classroom settings), but if they conflict, the elephant usually wins.
 
After reading Haidt, I’ve stopped thinking of students as people who simply make choices about whether to pay attention, and started thinking of them as people trying to pay attention but having to compete with various influences, the largest of which is their own propensity towards involuntary and emotional reaction. (This is even harder for young people, the elephant so strong, the rider still a novice.)
 
Regarding teaching as a shared struggle changes the nature of the classroom. It’s not me demanding that they focus — it’s me and them working together to help defend their precious focus against outside distractions. I have a classroom full of riders and elephants, but I’m trying to teach the riders.
 
And while I do, who is whispering to the elephants? Facebook, WeChat, Twitter, Instagram, Weibo, Snapchat, Tumblr, Pinterest, the list goes on, abetted by the designers of the Mac, iOS, Windows, and Android. In the classroom, it’s me against a brilliant and well-funded army (including, sharper than a serpent’s tooth, many of my former students.) These designers and engineers have every incentive to capture as much of my students’ attention as they possibly can, without regard for any commitment those students may have made to me or to themselves about keeping on task.
 
It doesn’t have to be this way, of course. Even a passing familiarity with the literature on programming, a famously arduous cognitive task, will acquaint you with stories of people falling into code-flow so deep they lose track of time, forgetting to eat or sleep. Computers are not inherent sources of distraction — they can in fact be powerful engines of focus — but latter-day versions have been designed to be, because attention is the substance which makes the whole consumer internet go.
 
The fact that hardware and software is being professionally designed to distract was the first thing that made me willing to require rather than merely suggest that students not use devices in class. There are some counter-moves in the industry right now — software that takes over your screen to hide distractions, software that prevents you from logging into certain sites or using the internet at all, phones with Do Not Disturb options — but at the moment these are rear-guard actions. The industry has committed itself to an arms race for my students’ attention, and if it’s me against Facebook and Apple, I lose.
 
The final realization — the one that firmly tipped me over into the “No devices in class” camp — was this: screens generate distraction in a manner akin to second-hand smoke. A paper with the blunt title Laptop Multitasking Hinders Classroom Learning for Both Users and Nearby Peers says it all:

We found that participants who multitasked on a laptop during a lecture scored lower on a test compared to those who did not multitask, and participants who were in direct view of a multitasking peer scored lower on a test compared to those who were not. The results demonstrate that multitasking on a laptop poses a significant distraction to both users and fellow students and can be detrimental to comprehension of lecture content.
 
I have known, for years, that the basic research on multi-tasking was adding up, and that for anyone trying to do hard thinking (our spécialité de la maison, here at college), device use in class tends to be a net negative. Even with that consensus, however, it was still possible to imagine that the best way to handle the question was to tell the students about the research, and let them make up their own minds.
 
The “Nearby Peers” effect, though, shreds that rationale. There is no laissez-faire attitude to take when the degradation of focus is social. Allowing laptop use in class is like allowing boombox use in class — it lets each person choose whether to degrade the experience of those around them.
 
Groups also have a rider-and-elephant problem, best described by Wilfred Bion in an oddly written but influential book, Experiences in Groups. In it, Bion, who practiced group therapy, observed how his patients would unconsciously coordinate their actions to defeat the purpose of therapy. In discussing the ramifications of this, Bion observed that effective groups often develop elaborate structures, designed to keep their sophisticated goals from being derailed by more primal group activities like gossiping about members and vilifying non-members.
 
The structure of a classroom, and especially a seminar room, exhibits the same tension. All present have an incentive for the class to be as engaging as possible; even though engagement often means waiting to speak while listening to other people wrestle with half-formed thoughts, that’s the process by which people get good at managing the clash of ideas. Against that long-term value, however, each member has an incentive to opt out, even if only momentarily. The smallest loss of focus can snowball, the impulse to check WeChat quickly and then put the phone away leading to just one message that needs a reply right now, and then, wait, what happened last night??? (To the people who say “Students have always passed notes in class”, I reply that old-model notes didn’t contain video and couldn’t arrive from anywhere in the world at 10 megabits a second.)
 
I have the good fortune to teach in cities richly provisioned with opportunities for distraction. Were I a 19-year-old planning an ideal day in Shanghai, I would not put “Listen to an old guy talk for an hour” at the top of my list. (Vanity prevents me from guessing where it would go.) And yet I can teach the students things they are interested in knowing, and despite all the literature on joyful learning, from Maria Montessori on down, some parts of making your brain do new things are just hard.
 
Indeed, college contains daily exercises in delayed gratification. “Discuss early modern European print culture” will never beat “Sing karaoke with friends” in a straight fight, but in the long run, having a passable Rihanna impression will be less useful than understanding how media revolutions unfold.
 
Anyone distracted in class doesn’t just lose out on the content of the discussion, they create a sense of permission that opting out is OK, and, worse, a haze of second-hand distraction for their peers. In an environment like this, students need support for the better angels of their nature (or at least the more intellectual angels), and they need defenses against the powerful short-term incentives to put off complex, frustrating tasks. That support and those defenses don’t just happen, and they are not limited to the individual’s choices. They are provided by social structure, and that structure is disproportionately provided by the professor, especially during the first weeks of class.
 
This is, for me, the biggest change — not a switch in rules, but a switch in how I see my role. Professors are at least as bad at estimating how interesting we are as the students are at estimating their ability to focus. Against oppositional models of teaching and learning, both negative—Concentrate, or lose out!—and positive—Let me attract your attention!—I’m coming to see student focus as a collaborative process. It’s me and them working to create a classroom where the students who want to focus have the best shot at it, in a world increasingly hostile to that goal.
 
Some of the students will still opt out, of course, which remains their prerogative and rightly so, but if I want to help the ones who do want to pay attention, I’ve decided it’s time to admit that I’ve brought whiteboard markers to a gun fight, and act accordingly.
 
https://medium.com/@cshirky/why-i-just-asked-my-students-to-put-their-laptops-away-7f5f7c50f368
 
===========================================================
 
9.  THIS MONTH’S LINKS:
 
     500 PEOPLE DANCING IN THE SKY – THIS IS AMAZING!

 
https://www.youtube-nocookie.com/embed/8oqPR5-GLuA?rel=0  
 
     MEDICARE COST PROJECTIONS AND REALITY…
 
http://www.nytimes.com/interactive/2014/08/26/upshot/100000003076821.embedded.html?_r=0
 
     THE 20 MOST PEACEFUL COUNTRIES IN THE WORLD…
 
http://travel.amerikanki.com/most-peaceful-countries-in-the-world/
 
     THE 1%’S OBSCENE & GROWING SHARE OF ALL U.S. WEALTH…
 
http://www.dailykos.com/story/2014/09/26/1332513/–The-most-important-chart-about-the-American-economy-you-ll-see-this-year?detail=email
 
     PETER FUNT FOLLOWS IN HIS FATHER’S FOOTSTEPS WITH CANDID CAMERA…
 
http://www.nytimes.com/2014/09/27/opinion/curses-fooled-again.html?action=click&contentCollection=Opinion&module=MostEmailed&version=Full&region=Marginalia&src=me&pgtype=article
 
===========================================================
 
To subscribe email FatherWilliam@ThirdAgeCenter.com  with “Subscribe” in the Subject line. Thank you.
 
===========================================================
 
To unsubscribe email FatherWilliam@ThirdAgeCenter.com with “Unsubscribe” in the Subject line. Thank you.
 
===========================================================
 
© Copyright 2002-2014, The Center for Third Age Leadership, except where indicated otherwise. All rights reserved worldwide. Reprint only with permission from copyright holder(s). All trademarks are property of their respective owners. All contents provided as is. No express or implied income claims made herein. This newsletter is available by subscription only. We neither use nor endorse the use of spam.
 
Please feel free to use excerpts from this newsletter as long as you give credit with a link to our page: www.ThirdAgeCenter.com. Thank you!
 
===========================================================