Opening on Aug. 17, Sunrise Springs Integrative Wellness Resort, a 52-room spa resort in Santa Fe, focuses on “nature bathing,” the opportunity to dwell in nature as a stress reducer and energy booster. Daily activities include yoga, meditation, Native American rituals, therapeutic gardening and animal interactions such as chicken therapy, which is presented as a soothing activity that involves stroking a bird’s feathers.
An on-site greenhouse and kitchen garden will serve as showcases for gardening lessons and food sources for the restaurant. Guests are encouraged to unplug from their digital devices. They can seek health consultations with staff doctors and specialists in both Eastern and Western medicine.
“At Sunrise Springs, we encourage our guests to unplug, tune-in and actively engage in their lives,” said David Hans, a psychologist and the resort’s executive director, in a news release. Rates start at $675 per person per day, single occupancy, with a two-night minimum stay, including meals and activities.
Sunrise Springs has been one or another kind of destination for the decades I’ve lived in New Mexico. My wife and I had some delightful meals there in previous incarnations.
We feel no urgency to visit a wellness resort. If anything, we kind of count Lot 4, here, as achieving most of the same functions – though I haven’t done any “nature bathing” or poultry petting since I spent summers on my grandparents’ farm when I was a kid.
In 1985, The New York Times published a snippet of comforting news for self-conscious solo eaters. “Dining alone,” the newspaper reassured readers, “is no longer viewed as odd.” At the time, eating spaghetti and meatballs by yourself wasn’t exactly the norm. A second article, which ran only seven months later in the Times, chronicled the stigma of solo dinners.
Thirty years later, thanks to a range of social and cultural trends, eating alone has become less of an occasional exercise than a fact of life. Nearly half of all meals and snacks are now eaten in solitude, according to a new report by the Food Marketing Institute, an industry trade association. The frequency varies by meal — people are more likely to eat breakfast by themselves than lunch or dinner — but the popularity of solo dining is, no doubt, on the rise, and has been for some time…
Indeed, a 1999 survey found that the number of people who ate alone at least part of the time tripled between the 1960s and 1990s. By 2006, nearly 60 percent of Americans regularly ate on their own, according to the American Time Use Survey. Today, that number is even higher.
Breakfast has undergone the most significant transformation. Roughly 53 percent of all breakfasts are now eaten alone, whether at home, in the car, or at one’s desk, according to the latest report.
Lunch, meanwhile, is nearly as lonely these days. Some 45 percent of midday meals are had alone, according to the report.
Dinner is the only meal that is still largely communal. Roughly three quarters of all suppers are still eaten with others today. But even that is changing…
One of the clearest reasons for the shift is something that has been happening to American households, gradually, for decades: They have been getting smaller. Over the more than 40-year span between 1970 and 2012, the percentage of households that contained a single person grew from 17 percent to 27 percent, according to Census Bureau data.
“Only 13 percent of households had one person in them in the 1960s,” said Seifer, who credits marriage and family trends with the rise of the single person American household. “People are either delaying marriage or putting off the formation of families after they get married more and more these days.”
People are also eating alone because they’re pressed for time.
But for all the hoopla about braving the restaurant world alone, the breakfasts, lunches, and dinners being eaten without companions these days aren’t happening at fancy eateries or fast food chains. Most of them, in fact, are being eaten in the comfort of one’s home. What that has meant so far is more delivery, which has been a boon for services like Seamless, and prepared foods, like Trader Joe’s Indian meals, which are selling exceptionally well.
The food industry understands this, which is why restaurants across the country have signed up to delivery services in droves, and, in part, why companies like Maple, a delivery-only restaurant based in New York City, exist.
Work stresses and scheduling are part of the equation – in households with couples. The years my wife and I were both working demanded separate breakfasts. She left for work a couple hours earlier than I. Retirement for me made it easier for the two of us – and now that she’s retired, as well, we’ve managed to build a new schedule that allows for “convening” even when we’re not sharing the same tastes.
The “take-it-home” meals for one are a phenomenon we noticed a decade ago when we were silly enough to think we could afford to shop at Whole Foods. The space they dedicate to attractive take-out was a real surprise. We see the same process on a smaller scale at Sprouts – and just as much dedicated display space at our local Trader Joe’s.
Nothing we ever sample, of course. We both happen to be good cooks.
For decades, “family planning” was synonymous with contraception. The Guttmacher Institute — a prominent reproductive health think tank — stated that “controlling family timing and size can be a key to unlocking opportunities for economic success, education, and equality” for women. In fact, their most recent analysis concluded that effective contraception has contributed to increasing women’s earning power and narrowing the gender pay gap.
Whether for these reasons or not, studies have consistently demonstrated that many women are choosing to delay childbearing. The age of first birth for women in developed countries is now approaching 28 and the birth rate in the USA is at an all-time low… it is important that more women become aware of the potential benefit of oocyte freezing. In a recent study called “Baby Budgeting,” one research group described this technique of freezing and storing eggs as a “technologic bridge” from a woman’s reproductive prime to her preferred conception age.
Today egg freezing has made it possible for women to truly “plan their family” by storing eggs for later use. The first successful pregnancy from frozen eggs was reported in 1986. But for decades the process remained very inefficient, requiring about 100 eggs for each successful pregnancy. Therefore, the procedure was considered experimental and primarily offered to women who were facing chemotherapy, radiation, or other fertility-robbing treatments used to treat serious illnesses. But with the development of more effective techniques for freezing eggs, success rates in many centers are now nearly as good with frozen eggs as with fresh ones.
As a result of this improvement in pregnancy rates, the American Society for Reproductive Medicine lifted the “experimental” label from egg freezing and began supporting its use for social (rather than medical) reasons…
For practical reasons, the process of creating a fertility plan should involve consideration of a woman’s current age, how many children she would like to have, and her ovarian reserve. Existing guidelines suggest that if a woman is in good health, younger than 31 with a normal ovarian reserve, she should wait and reevaluate her situation every one to three years. At the other end of the spectrum, if a woman is older than 38, she should consult with a board-certified reproductive endocrinologist to discuss her options.
The wider the range of choices available to a woman, the better. This doesn’t mean choices get easier – but, the ability to choose, to decide when or whether she has a pregnancy, offers a broader look at the life she wants to build.
The Atomic Bomb Dome preserves one of the only structures left standing in Hiroshima after the world’s first nuclear attack 70 years ago. It’s now a World Heritage Site.
Reuters offers one of their great visual galleries – of the atomic blasts, then – and what lives there, now. From the Wider Image.
It reads like the script from one of his horror films: a stolen head, burnt black candles and satanic symbols – but this week they became elements of the Berlin police department’s latest case.
The head in question belonged to Friedrich Wilhelm Murnau, the director of the iconic early-20th-century Dracula film adaptation ‘Nosferatu – a Symphony of Horror’, taken from his grave near the German capital.
And officers have already turned their attention to Germany’s darker sects as they search for those who took the well-preserved body part from a grave site often scrawled with pentangles and other symbols of devil worship.
Murnau’s ‘Nosferatu’ was and remains one of the most important milestones in cinema.
Based upon Bram Stoker’s 1897 novel ‘Dracula’, it told the story of the undead Count Orlok, and its moody scenes and clever camera angles influenced generations of fans and filmmakers alike.
But death was at the heart of the movie and death has continued to stir the passions of vampire lovers ever since it was made in 1922.
Indeed, his own death in 1931 aged 41 was enough to elicit some fascination of its own: openly homosexual, he was engaging in oral sex with his 14-year-old Filipino houseboy on the Pacific Coast Highway at Santa Barbara when he lost concentration and slammed into a telegraph pole.
His corpse was embalmed and placed in a metal coffin, and the following year it was shipped to Germany for burial in Stahnsdorf’s south-west cemetery.
And down the years the lovers of the undead – goths, ghouls and living vampires who get their sexual thrills from the drinking of human blood – have made the pilgrimage to the grave of Murnau to pay their respects to a man…
Stahnsdorf cemetery warden Olaf Ihlefeldt found the head missing as he slid the lid of the coffin away while investigating minor damage he had spotted on mausoleum number 22.
‘The body is still in pretty good condition,’ he said.
Murnau’s head was still recognisable and had its hair and teeth, he added, ‘the last time I saw it’.
RTFA for tidbits and collateral tales of Satanism, vampire cults and other slightly disturbing religious rationales for often-demented, sometimes fanciful behavior.
Good enough for today’s TV series.
I must admit that when my parents convinced the head librarian of our neighborhood Carnegie Library that I – 8 years old – had exhausted the offerings for teens and pre-teens and required an adult library card, I almost blew it when the first book I went to borrow was Bram Stoker’s “Dracula”.
Murnau’s “Nosferatu” has long been my favorite silent film. If you require a soundtrack, try the version by Werner Herzog, “Nosferatu the Vampyre”, starring Klaus Kinski and Isabelle Adjani.
Many Americans are tired of explaining things to idiots, particularly when the things in question are so painfully obvious, a new poll indicates.
According to the poll, conducted by the University of Minnesota’s Opinion Research Institute, while millions have been vexed for some time by their failure to explain incredibly basic information to dolts, that frustration has now reached a breaking point.
Of the many obvious things that people are sick and tired of trying to get through the skulls of stupid people, the fact that climate change will cause catastrophic habitat destruction and devastating extinctions tops the list, with a majority saying that they will no longer bother trying to explain this to cretins.
Coming in a close second, statistical proof that gun control has reduced gun deaths in countries around the world is something that a significant number of those polled have given up attempting to break down for morons.
Finally, a majority said that trying to make idiots understand why a flag that symbolizes bigotry and hatred has no business flying over a state capitol only makes the person attempting to explain this want to put his or her fist through a wall.
In a result that suggests a dismal future for the practice of explaining things to idiots, an overwhelming number of those polled said that they were considering abandoning such attempts altogether, with a broad majority agreeing with the statement, “This country is exhausting.”
The Borowitz Report nails it, once again. Though I’ve spent most of my adult life fighting to overcome several of the most significant stupidities in American culture, I admit that retirement from work a number of years ago – combined with the Web offering avenues for activism that don’t involve getting into my pickup truck and driving to town – opened the door to a life of being a proper hermit.
Now that my wife has achieved early retirement, we get to be hermits together. Which suits both of us. I can holler at the idjits on television. Which I can turn off. I can harangue the nation via my personal blog. Which I can turn off.
We go for walks with Sheila the dog.
Can babies use iPads?
If you’ve ever viewed YouTube videos of infants and toddlers using iPads, then you know the answer is a resounding “Yes.”
But how are they using them?
To answer that question and others, a team of University of Iowa researchers set out to study more than 200 YouTube videos. Their paper is published in the proceedings of the CHI 2015 conference, the most prestigious in the field of human-computer interaction.
In the paper they write that their goal was to “provide a window into how these children are using tablets through an analysis of relevant YouTube videos.”
What they found was information that supports “opportunities for research and starting points for design.”
“By age two, 90 percent of the children in the videos had a moderate ability to use a tablet,” says Juan Pablo Hourcade, associate professor of computer science in the UI College of Liberal Arts and Sciences and lead author of the study. “Just over 50 percent of 12-to-17-month-old children in the videos had a moderate ability…”
He says that to his knowledge, other researchers have conducted surveys of the prevalence of tablet use by young children, however, the UI study is the first to study how infants and toddlers are actually using the devices…
Hourcade acknowledged the drawbacks of using unsolicited YouTube videos, such as not knowing the exact ages of the children pictured and that the children pictured were selected by their caregivers and may not be representative of the larger society. However, he says the researchers were able to estimate the ages of the children (two-thirds of the videos included the age) and observe a clear progression of successful performance linked to age that is consistent with developmental milestones…
He says he hopes that the study and others that follow will influence the development of apps that encourage interactive education for infants and toddlers. The apps he envisions might be similar to the social, interactive children’s programs currently found on public television.
Interesting stuff. I almost always end up supporting any sort of investigation that encourages early education.
My parents taught both my sister and me to read by the time we each were 4 years old. And we had plenty of reading material available for the following age group – and beyond. Speaking subjectively, it was a great advantage throughout school for each of us.