The Michigan Daily — michigandaily.com
Opinion — Wednesday, March 17, 2021 — Page 12

Normalize discussing the reality of family dynamics
KATHERINE KIESSLING | COLUMNIST

For anyone who, like me, is obsessed with the British royal family, the CBS broadcast of Oprah Winfrey’s interview with Meghan Markle and Prince Harry on Sunday night was merely another episode in the Netflix series “The Crown.” However, the interview had a subtlety that transforms the royal family’s drama into an experience that is relatable to many Americans: a dysfunctional family.

In an attempt to create common ground following World War II, the monarchy shifted its cultural focus toward children and families. Consequently, when Prince Charles was born in 1948, Queen Elizabeth and Prince Philip were portrayed as models for how a contemporary family should function. In every perceivable way, they were a young, powerful and flawless family. Seeing the opportunity, the British government used Queen Elizabeth and her family to promote the idea that family connects the British people to one another.

To cement this family emphasis, the royal family was made into the archetype for all British families to strive for. Significant events such as Queen Elizabeth’s coronation in 1953 intentionally centered on family. The first televised coronation, the ceremony was watched by over 27 million British people and included shots of Prince Charles and Princess Anne. British children were given commemorative coronation mugs to ensure every family felt personally included in the celebration.

However, in the decades that have followed, the public has become aware of various fractures and scandals within the royal family, with none more infamous than the clash between Princess Diana and Prince Charles. Their messy divorce further damaged the facade of the happy family at a time when public opinion of the royal family was low due to accusations of numerous marital affairs and extravagant lifestyles. The tell-all book “Diana: Her True Story” and several television interviews described the inevitable collapse of the Prince and Princess of Wales’ marriage and how her cries for help with her mental health were ignored. BBC documentaries were no longer enough to fill the cracks that had appeared in the world’s most iconic family. But in its own unique way, the royal family has modernized with the rest of the world.

During the two-hour special, Meghan and Harry revealed a heavily flawed family, which stands in stark contrast to the image of the royal family projected during the 1950s. Altogether, the lack of support and understanding from the rest of the royal family — in regard to Meghan and Harry’s requests for assistance with their mental health and security — is a clear example of a divided family.

Meghan Markle and Prince Harry’s decision to speak publicly about their situation does a lot more than just expose the royal family’s internal affairs. More importantly, their interview testifies to the importance of discussing your own family dynamics. In many ways, Markle’s experience parallels that of Princess Diana. Both were brought into a family that treated them unfairly, which led them to speak out and call attention to the circumstances they struggled with. By doing so, Meghan and Diana demonstrated the importance of holding the monarchy accountable for its actions, regardless of familial relationships.

In America, family life has changed dramatically, with an increasing number of children from single-parent households and families becoming smaller overall. Due to these shifts, a dominant family form no longer exists in America as it did in the 1960s. Research shows that the structure of modern families can be tied to the existence of dysfunctional families. While not all single-parent or blended families are dysfunctional, they are less likely to have “standard” relationships due to outside factors, including social environment and inability to provide adequate childcare.

Dysfunction within a family setting can take multiple forms, including mental and physical abuse, yet many children are unaware their family environment is not considered standard. Unfortunately, these conditions have long-lasting effects on children, including low self-esteem, absence of identity and difficulty cultivating relationships. Furthermore, children from dysfunctional families regularly justify their parents’ actions and are never taught the signs of unhealthy family dynamics because this is rarely discussed.

In the United States, there remains a preference for households of two parents in their first marriage with multiple children. Yet, as divorce and single parenting have become more socially acceptable, the typical suburban family with a white picket fence is no longer a realistic model. Consequently, neither is a family that gets along perfectly with one another.

By revealing the undercurrents of the royal family, Meghan Markle and Prince Harry’s interview has helped normalize the discussion of family dynamics — whether good or bad. For too long, people have hidden away their experiences with their family by adhering to the cliché that blood is thicker than water. Openly talking about family, regardless of structure or dynamic, should be more common and acceptable. Clinging to the belief that the only respectable household form is one dating back to the post-World War II era is outdated and harmful to children.

Having more frequent and open conversations allows children and teenagers to learn what constitutes acceptable treatment from family members. Moreover, discussions about the reality of family erode the social stigma of being raised in an imperfect family. Hearing people around you talk only about the good parts of their family creates a false sense of the lives others live — which can be amplified if one has to return to a home that seems the opposite. The family experience you portray to your friends and peers should be representative of your reality, not merely what you believe the rest of society wants to hear.

Katherine Kiessling can be reached at katkiess@umich.edu.

Minimum wage hikes should be left to the states
EVAN STERN | COLUMNIST

After Sens. Jon Ossoff, D-Ga., and Raphael Warnock, D-Ga., won their Georgia runoff races in early January and gave Democrats a slim majority in the Senate, advocates of raising the federal minimum wage saw an opportunity to finally pass a wage hike. However, those hopes were quickly dashed when the Senate voted 58-42 against a proposal sponsored by Sen. Bernie Sanders, I-Vt., to increase the wage to $15 an hour.

This vote not only deals a major blow to those rallying behind a minimum wage hike, but could also force President Joe Biden to compromise on his promise to raise the minimum wage, unless he can influence Republicans and moderate Democrats like Sen. Joe Manchin, D-W.Va.

Regardless of the Senate’s rejection of a $15 minimum wage, it’s obvious that a minimum wage increase is warranted in certain areas across the country. A recent poll conducted by Monmouth University finds that 53% of Americans support raising the federal minimum wage to $15 an hour. The minimum wage has been fixed at $7.25 an hour since 2009, and our country has experienced inflation since then. At the same time, our problems of wealth and income inequality have worsened; according to the Pew Research Center, the wealthiest Americans are getting richer fastest while lower- and middle-income households are falling behind. With sky-high costs of living in many locations, raising the minimum wage would lower poverty levels and seems like a common-sense measure.

But while raising the minimum wage would be beneficial on a number of fronts, there are also issues with raising the federal minimum wage to $15, as evidenced by the bipartisan vote against it in the Senate. One of the clearest arguments against a minimum wage increase is that it would precipitate broad job losses across the entire economy. In a recent report, the Congressional Budget Office estimated that a hike to $15 would slash a total of 1.4 million jobs, reducing national employment levels by almost 1% due to layoffs, a notable hit to the labor force. As the economy begins to recover from the disastrous COVID-19 pandemic, a minimum wage increase could undermine much-needed economic growth by forcing over a million people out of work.

There are valid arguments for raising the wage to $15 or keeping it as is, and we should give all viewpoints attention. But what is indisputable is that select areas of the country would benefit from a minimum wage hike of some sort. Instead of waiting for our representatives and senators to sort out their policy differences in Congress — something that could take years — each state should address its own minimum wage.

Handing over responsibility for the minimum wage to states makes sense on a lot of levels. Individual states already have the power to regulate their own minimum wages, although there currently isn’t a great incentive to do so since the federal government is trying to control the minimum wage for the entire nation. Encouraging states to address their own minimum wages would offer a major boost to employees around the country; workers subject to both federal and state rates are “entitled to the higher of the two minimum wages,” according to Cornell Law School.

While state action doesn’t deliver the broad $15 hourly rate that many have been fighting for, it gives state governments the power to assess their own economies and make their own informed decisions. It allows states in need of an increase to act; at the same time, it allows states to opt out based on their own circumstances. California sets a good example for how states should use their powers in response to the unique conditions within their borders: at the beginning of this year, the Golden State raised its minimum wage to $14 an hour.

One of the best arguments for this approach is that the median cost of living differs dramatically from state to state, as does the current median wage. Instead of painting with a broad brush, giving authority to each state acknowledges these economic differences and promotes a stronger, more detailed solution.

For instance, the average cost of living in the state of Mississippi is over 15% lower than the national average. By contrast, the state of California, notorious for its high cost of living, exceeds the national average cost of living by almost 50%. Whereas a minimum wage hike to $15 may do more harm than good in Mississippi and similar states due to their low costs of living, it would be beneficial in the entire state of California, not just a select few areas.

Moreover, the living wage — the amount necessary to live a comfortable lifestyle — comes out to about $58,000 per year in Mississippi, compared to nearly $100,000 in California. It makes sense that workers in these two states should make considerably different amounts since their costs of living are quite different. Why should these two states have the same minimum wage?

Without a doubt, the current minimum wage from 2009 is inappropriate for certain parts of our country. Particularly in areas with skyrocketing living costs, this glaring problem needs to be addressed. But rather than institute a national measure, something the Senate has already tried and rejected, states should revise their own minimum wages in response to their unique and evolving needs. In the end, delegating the power to each state will pave the way for a solution that acknowledges our differences while supporting America’s workers at every turn.

Evan Stern can be reached at erstern@umich.edu.

Seriously, show yourself some empathy
MEERA KUMAR | COLUMNIST

Whenever people ask, “How are you?” I reflexively respond with, “Good! How ‘bout you?” I never take time to think about how I’m truly doing. Generally, asking someone how they are is simply an easy greeting — often, people don’t care for a genuine answer. So, we tell everyone, including ourselves, that we’re good.

Even though we preface nearly every conversation with a “How’s it going?”, nobody seems to pay attention to their state of being. “Good,” we say. “Fine.” Our reflex is to lie without searching for an answer.

When I ask how you are, I want your honest answer. Really, truly, how are you? Because, in all honesty, I’m terribly, totally, utterly burnt out.

These days, so are most people I know. Every day, when I call my best friend Nandini, we describe our plans for the rest of the day. Pretty much daily, we work from when we rise to when we go to sleep. Often 18 hours later, both of us accidentally fall asleep on FaceTime calls. Nearly every student I’ve spoken to at the University of Michigan has mentioned their exhaustion with the overwhelming workload of online school. The entire world seems to be weighed down with Zoom fatigue.

Yet, we continue to chip away at our seemingly endless pile of work. We convince ourselves that we need to be a certain amount of “productive” each day. Otherwise, the day is a waste. In our quest to simulate a seemingly “normal” virtual learning experience, we sacrifice our mental health. In our haze, we spend all of our time in a single cramped place, lost in the small details and unable to see the big picture.

Nowadays, our contact with the outside world is limited. The only time we get asked how we are is when a friend or professor asks at the beginning of a conversation or class, and it can feel nice to know that someone cares about us, even in the form of a trivial greeting. We search high and low for empathy in any location we can find. Without even realizing it, I’ve relied on others to ask me how I am.

My friends and I, often concerned for each other’s mental health, advise each other to take a nap or go for a walk. However, we seldom give the same advice to ourselves. Why are we so incapable of taking care of ourselves in the way we take care of others?

The noted Australian psychologist Godfrey Barrett-Lennard has an answer: we lack empathy toward ourselves. Due to our emotional involvement with our problems, it can be challenging to understand what’s really going on. Self-empathy, a valuable skill, allows us to zoom out and view a situation with impartiality. We can figure out how we’re doing, and if we’re not doing well, we can find necessary solutions. Barrett-Lennard says the goal of therapy is to gain more empathy for oneself and others.

It would be easy to dismiss this concept as part of the recent commercialized, extremely hyped self-care movement. However, it’s important to note that self-empathy and self-compassion are not the same thing. Self-compassion (which is also extremely important) involves showing love and kindness to oneself. In contrast, self-empathy involves simply observing the patterns of emotions we experience — it sounds like a no-brainer but is actually incredibly hard to maintain.

Having the skill of self-empathy is empowering. Instead of depending on others’ superficial care to (barely) check in with yourself, you’re able to give yourself empathy and solve problems in emotionally charged situations. When you’re overwhelmed, instead of staring at a screen for hours, take a break.

When you’re tired, instead of forcing yourself to work even longer, take a nap. In our “hustle-obsessed” culture, forcing yourself to struggle is seen as positive. But this has extremely harmful effects. Instead of ignoring our feelings, we should encourage people to be conscious of them and deal with complex emotions accordingly.

Commonly cited ways to get in touch with oneself include therapy, meditation, journaling and more. However, these are easier said than done — it’s important to remember that taking care of one’s mental health can sometimes require active effort. It’s necessary to ask yourself: How am I? If you’re honest with yourself, there’s a good chance you’re doing less than stellar. Think about it. Talk it out. Write it down. Leave yourself a voice memo on your phone. And try not to hate yourself for your negative feelings or your perceived inability to do work. While figuring out one’s feelings, it’s essential to lead with empathy and not idealistic expectations.

If you’re burnt out, like most college students right now, I implore you to take a step back. In fact, I’m encouraging you to do the bare minimum of work needed to get through the week. Skip a class or two if you need to — forcing yourself to go to class exhausted won’t help your learning. When I mention this to friends, they guiltily say, “I would, but I’ve already been slacking off this week…” — good for you for taking time off for yourself! If you’re still stressed, I’m begging you to take more time off to recharge. Forcing yourself to be productive while burnt out is unrealistic.

When we step off the treadmill, we feel incredibly guilty. However, it’s important to ask yourself: Who is this guilt benefiting? (Hint: it doesn’t look like the guilt is good for you.) Your insecurity over how much work you’ve completed that day has been perpetuated by companies obsessed with “productivity” to maximize profit — by creating a culture where destroying yourself to please your boss feels necessary. We must unlearn this mindset. Stop expecting yourself to operate at full capacity at all times, especially during a pandemic — seriously, show yourself some empathy.

In this age, we measure our self-worth by how “productive” we were earlier in the day. I urge you to rebel and see yourself as worth more than a machine that cranks out essay after essay. Distance yourself from your work — you deserve a break.

Meera Kumar can be reached at kmeera@umich.edu.

Design by Man Lem Cheng

Preventing the next pandemic
ALEX NOBEL | COLUMNIST

Throughout human history, the main determinant separating humans from other species has been our ability to adapt the environment to address our needs. Whether harnessing fire to stay warm in cold temperatures, inventing the wheel to transport things long distances or utilizing the earth’s magnetic field to guide navigation, humans have made advancements that make it easier for us to live. But what happens when we overstep and go too far?

Since the 1980s, there has been a “steep rise in the number of outbreaks globally.” A Brown University study found that between the years 1980 and 2014, there were more than 12,000 infection outbreaks affecting 44 million people around the world. In order to stop disease outbreaks from becoming more and more common, human society needs to drastically rethink how it interacts with nature.

One of the forces driving the increasing number of infectious outbreaks is how land is used. It is estimated that humans have changed 75% of all land globally. What that looks like depends on the region. It could mean urbanization and suburban sprawl or deforestation and mining, all of which make outbreaks more likely. The Centers for Disease Control and Prevention estimates that three out of every four new diseases begin with animals. These are called zoonotic diseases — scientists believe there are more than 1.7 million undiscovered zoonotic viruses, of which roughly half are predicted to be able to infect humans.

Changes to the land bring humans and animals closer together. Deforestation and other land-use changes are responsible for about one-third of new diseases. As habitats are destroyed for thousands of species, they are forced to migrate into new ecosystems, potentially disrupting the food chain or being exposed to new toxins. Eventually, they come in contact with humans, increasing the risk of a zoonotic disease outbreak among people. An example of this is the origin of the 2014 Ebola outbreak in Guinea. One of the first people to contract Ebola was a little boy playing under a tree where a large number of bats had begun to live. Bats do not like living near humans — they had been forced out of their homes by forest clearing and mining done by foreign companies.

Urbanization contributes to the spread of disease by concentrating 4 billion people globally into small areas, sometimes living in very unclean conditions. As the world becomes increasingly urban and more people move to cities, it presents the perfect opportunity for diseases to reach the level of an outbreak.

Suburban areas pose a different threat to public health. Since the 1970s, Lyme disease has affected the Northeast United States. Lyme disease is spread by ticks that feed on infected white-footed mice and then transfer the bacteria to humans. When people started moving out of cities and the suburbs began to expand, former forest and agricultural land was repurposed. This disrupted the ecosystem and harmed predators of the white-footed mouse, allowing the mice to grow their population and thereby increasing the frequency of Lyme disease.

Another contributor to the increase of infectious diseases is climate change. Like many things, climate change seems to just exacerbate existing problems, making them more harmful and harder to solve. As climate change leads to hotter and wetter climates, the spread of infectious diseases becomes easier. Infections that are transmitted through water, food, mosquitoes and ticks spread more easily in warmer and wetter climates. Studies have found that diseases such as dengue, malaria and cholera have already become more contagious with the changing climate. Warmer temperatures are also forcing thousands of species to migrate from areas they had lived in for centuries. The migration of these species brings them closer to humans, which again contributes to the spread of disease.

While these changes are scary, the good news is that there are actions humans can take. One major step is to stop deforestation. This would help keep many habitats intact and, in turn, prevent species from being forced to migrate. Additionally, millions of people around the world call forests their home. Forests also help to reduce carbon in the atmosphere — absorbing more carbon than the U.S. emits every year — which helps mitigate climate change and therefore helps limit the spread of infectious diseases.

Another thing humans can do is adopt the One Health approach to governing. This approach looks at humans, animals, plants and their common environment holistically. When using a lens that assumes we are all part of nature, we are all unhealthy when one part is unhealthy. Designing policies and programs that take the health of everything into consideration will result in a healthier public. This approach, composed of thorough policy proposals rooted in research, can be used in environmental regulations to ensure air and water quality are not making anyone sick.

Urban planners can also use this approach to design cities in ways that best decrease opportunities for diseases to spread. Lastly, the One Health approach can be used in epidemiology. Unhealthy environments can cause asthma or other respiratory issues, and sick animals have caused Ebola, Lyme disease, Zika and plenty of other infectious diseases. Looking at public health through a holistic lens will allow us to identify and address many of the factors that contribute to human illness.

Something important to mention about outbreak prevention is that it looks different everywhere. A region’s development, population, culture, climate and countless other factors will influence how people decide to approach fending off disease outbreaks. In Thailand, for example, this took the form of an app. Residents use the app to send pictures of any animals or plants that look suspicious to public health officials and scientists, who are then made aware of diseases and can research or contain them before they become outbreaks.

Officials estimate that preventing the next pandemic will cost around $22 billion per year. While this is a large amount, it would be irresponsible not to consider such an investment, given that the economic costs of COVID-19 are predicted to be upwards of $10 trillion. Investing in pandemic prevention is not only the smart thing to do because it will save lives, but it is the economically smart decision too. COVID-19 has shown us that we must put an emphasis on public health, fund it adequately and address infectious disease prevention. We cannot forget that we live within nature and are not immune to nature’s consequences.

Alex Nobel can be reached at anobel@umich.edu.

Design by Mellisa Lee
