Will kids ever forgive us for depriving them of their childhood? What we put them through has been ruinous for their mental health

By: Robert Bridge

Critics of the lockdowns and school closures imposed to halt Covid-19 have compared their effects to child abuse. And now that new data points to some deeply disturbing long-term psychological damage, it looks like they were right.

Abiding by the new age medical maxim that commands ‘everyone stop living so that you don’t die’ is no way to live. Yet that is exactly how millions of youngsters have been forced to cope with a disease that poses, in the overwhelming majority of cases, no more of a health risk to them than riding a bicycle or crossing an intersection.

And while socially isolating the youth may have spared a minuscule fraction of them from contracting coronavirus, the toll such measures have taken on the mental wellbeing of this demographic makes for a disastrous tradeoff.

The results from the most inhumane experiment ever conducted on human beings are in, and we should all be ashamed of ourselves for letting it happen.

A white paper published by the nonprofit FAIR Health on the consequences of lockdowns for the mental health of American students confirms what many people already know: “School closures, having to learn remotely and isolating from friends due to social distancing have been sources of stress and loneliness.” The real shocker, however, is how that statement plays out in real life. In March and April 2020, at the height of the Covid-19 pandemic, mental health claims among this young demographic exploded by 97.0 percent and 103.5 percent, respectively, compared to the same months in 2019.

To break it down even further, there was a dramatic surge in cases of “intentional self-harm” – with a handgun, with a sharp object, or even by crashing a vehicle, to cite the more common examples. The rate of such destructive behavior among 13-18 year olds jumped 90.71 percent in March 2020 compared to March 2019. The increase was even greater when comparing April 2020 to April 2019, nearly doubling (99.83 percent). August 2020 was particularly grim in the Northeast of the country, which saw a surge of 333.93 percent.

Similarly large increases were found in the 19-22 age category, although not quite as pronounced as in the 13-18 group.

Another sign that young Americans have suffered undue psychological distress during the pandemic can be seen in the rates of overdose and substance abuse. For those between the ages of 13 and 18, overdoses increased by 94.91 percent in March 2020 and 119.31 percent in April 2020 over the same months the year before. Meanwhile, claims for substance use disorders surged in March (64.64 percent) and April (62.69 percent) 2020, compared to 2019.

In one sample of the 6-12 age group, claims for obsessive-compulsive disorder shot up in March 2020 (up 26.8 percent) and remained elevated through November (6.7 percent). Over the same period, nervous tic disorder increased some 28.7 percent by November. Another trend worth mentioning: before the pandemic began, females in the 13-18 group accounted for 66 percent of total mental health claims; from March 2020 onward, their share rose to 71 percent, compared to 29 percent for males.

The findings by FAIR are supported by other prominent studies, including one by the American Academy of Pediatrics, which found higher rates of suicide attempts in February, March, April, and July 2020 compared with the same months in 2019.

The unconscionable part of this tragedy is that children are known to be amazingly resilient to coronavirus. According to the European Centre for Disease Prevention and Control, the “majority of children do not develop symptoms when infected with the virus, or they develop a very mild form of the disease.” And when it comes to schools, “outbreaks have not been a prominent feature in the COVID-19 pandemic.”

At the same time, scientific studies have shown that children are not Covid-19 “super spreaders,” which means their teachers would be at low risk of infection. In other words, there is absolutely no reason that children should not be back in school, studying and socializing side by side with their friends in a supportive learning atmosphere.

Some places in the United States have begun to see the light. The Republican-run states of Arkansas, Florida, South Dakota and, most recently, Texas, encouraged by dropping infection rates and a nationwide push for vaccines, have fully reopened businesses and schools.

President Joe Biden, however, betrayed the severe political brinkmanship lurking behind Covid-19 when he slammed the decisions as “Neanderthal thinking.” In any case, while the gradual reopening of America is a welcome sign of much-needed sanity, it seems the damage has already been done as far as the mental condition of its youth is concerned. In fact, I find the consequences on a par with the trauma experienced during war, and in some ways even worse – not least because this was self-inflicted.

Covid-19, or rather our responses to it, have had all the destructive force of a hydrogen bomb – albeit a silent one – dropped smack in the middle of our communities, sucking the precious life out of them. Now entire families are forced to ‘shelter in place’ from an enemy they cannot see, while businesses, schools and even churches – the essential meeting places that give people hope and strength – have been forced to close their doors.

Children have been taught to look at each other warily, like walking chemical factories capable of infecting and even killing, as opposed to fellow human beings who can provide love, comfort and support. It is my opinion that the medical authorities who imposed this protracted lockdown on the youth have forfeited the right to practice medicine ever again – and a similar fate should await the politicians who sanctioned it.

Let’s be clear. We are not talking about the Black Plague of the 14th century, when entire towns were wiped out and bodies piled up in the streets as people fled to remote villages and the countryside to escape certain death. Not by a long shot. Yes, it is important to take precautions against this virus, but catching Covid is not a death sentence; an estimated 99.75 percent of those infected can expect to fully recover, while instances of children dying from coronavirus are exceedingly rare.

In the overwhelming majority of cases, those who do succumb to Covid are the elderly, who had already been weakened by “comorbidities.” While every death is regrettable, the sort of fatalities we are dealing with do not justify the lockdown of Main Street, to say nothing of businesses, churches and schools. It would have been far more humane had the elderly and sick been singled out for special protection, while the rest of the world got on with the business of living.

Instead, we did the most unconscionable thing imaginable, forcing young children – at the most momentous time of their lives – to adhere to social distancing rules while shutting down their schools and imprisoning them in their homes. That is simply cruel and unusual punishment. In a word, it is child abuse. We failed to heed the warning about where that proverbial road paved with “good intentions” might lead, and that is exactly where millions of children now find themselves: trapped in a mental hell of the adult world’s making. I pray that, one day, they forgive us.

Source and Image: https://www.rt.com/op-ed/517823-kids-forgive-covid-lockdown/


American university students are coddled, thin-skinned snowflakes, and social media is to blame

By: Robert Bridge

The explosion of ‘cancel culture’ and the social justice mindset on college campuses across the US was inspired by social media, where the idea of creating digital ‘safe spaces’ without ‘trolls’ has invaded the real world.

For those born around 1995, this column will likely be filed away under the heading: ‘Aging Generation X-er with No Clue Rails against Evils of Social Media.’ And I suppose there may be some truth to that claim. After all, the greater part of my life – like that of many other people – was spent without access to handheld technologies and the endless apps, add-ons and what-nots. That is not because I lived on an island, or was born among the Amish, but because such technologies simply were not around in my time. In other words, the youth of Generation X was defined more by Alexander Graham Bell than by Steve Jobs.

Today, the ‘reality’ for those born after 1995 – the so-called ‘Generation Z’ – is radically different from that of those born just a decade earlier, since they have had an intimate relationship with the Internet practically since birth. It would be naïve to think this demographic – many of whom were nurtured on social media – would reach adulthood with the same set of attitudes, values, and worldview as their predecessors. What’s shocking is just how different they really are.

Starting in 2014, just as Generation Z was entering college, a strange new phenomenon began surfacing on campuses across the country. Students, who are traditionally the staunchest defenders of free thought and the least likely to be prudes, began tossing around vague concepts carried over from the internet, such as ‘safe spaces,’ ‘microaggressions,’ and ‘getting triggered.’

A 2014 article in The New Republic shed an early light on this encroaching mentality: “What began as a way of moderating internet forums for the vulnerable and mentally ill now threatens to define public discussion both online and off,” wrote Jenny Jarvie. “The trigger … signals not only the growing precautionary approach to words and ideas in the university, but a wider cultural hypersensitivity to harm and paranoia about giving offense.”

But instead of adjusting their sails for the approaching tsunami of tears, universities broke with a thousand-year-old academic tradition, allowing the feelings and emotions of misguided adolescents to supersede the wisdom and reasoning of the educators. In fact, the world of academia not only failed to stop the flood, but, due to its own extreme liberal bent, helped to aggravate the strife by blaming the perceived ills of the world on some select bogeymen. More often than not these were dead white guys, members of a clan known as ‘the patriarchy’ that thrives today on its so-called ‘white privilege.’ Thus, college campuses are now riddled with angst and activism to the point that even the rules of English grammar and mathematics have become suspect.

Perhaps the greatest casualty of this radical makeover, however, is the trust that had been cultivated over the centuries between student and teacher. Professors today are hypersensitive to the grim fact that they may lose their jobs for doing or saying something ‘offensive’ that violates the rules of political correctness. At the same time, many colleges are now extremely hesitant to invite controversial speakers to campus for fear of ‘triggering’ their students and inciting protests.

The intellectual bubble that now encapsulates the college campus mirrors the reality on social media, where users have a strong tendency to mingle with only those individuals who share their worldview. Whenever some annoying outsider with a different opinion attempts to ‘troll’ them, canceling that person and their alternative views is as easy as ‘unfriending’ them. Meanwhile, there is a certain status and feeling of moral superiority that comes from ‘canceling’ some heretic that has fallen afoul of political correctness.

In the 2018 book ‘The Coddling of the American Mind’, Greg Lukianoff, the president of the Foundation for Individual Rights in Education, and Jonathan Haidt, a social psychologist, argue that the digital constructs of ‘safe spaces’ have done far more harm than good.

“Social media has channeled partisan passions into the creation of a ‘callout culture,’” Lukianoff and Haidt argue. “New-media platforms and outlets allow citizens to retreat into self-confirmatory bubbles, where their worst fears about the evils of the other side can be … amplified by extremists and cyber trolls intent on sowing discord and division.”

According to Lukianoff and Haidt, Generation Z’s fierce aversion to controversial and even shocking information means that college campuses have become “more ideologically uniform,” thereby hindering the ability of “scholars to seek truth, and of students to learn from a broad range of thinkers,” as has historically been the case at university.

The problem with allowing cancel culture to take root on social media and at universities in the first place is that American society is now confronted with a mammoth weed on its front lawn. And while most people agree it is a problem – at the very least an eyesore – those who propose solutions risk being canceled themselves.

Last month, for example, 150 public figures, including Noam Chomsky, Salman Rushdie and JK Rowling, attracted anger and ridicule after signing a letter that called out ‘cancel culture.’ In part, the letter warned that the “restriction of debate, whether by a repressive government or an intolerant society, invariably hurts those who lack power and makes everyone less capable of democratic participation.”

Not only were these left-leaning signatories extremely late to the game, but they themselves have been accused of attempting to silence voices – mostly conservative ones – that they did not agree with. Others, like Jennifer Finney Boylan, actually apologized to the mob for endorsing the milquetoast proposals put forward in the letter.

The tragic irony is that Western civilization, which was built on the free flow of ideas, is no longer capable of even pointing out problems without attracting scorn and derision. Such a repressive atmosphere, endorsed by ideologues who listen only to the voices inside their own heads, severely threatens future progress. If this dangerous new tendency is not confronted head-on and brought under control, it will be Western civilization itself that eventually finds itself ‘canceled’ due to its inability to evolve.

Source and Image: https://www.rt.com/op-ed/496957-us-university-social-media/


Get out of my face! Facial recognition technology could enslave mankind like never before

By: Robert Bridge

Advertised as the latest tool to give shoppers more convenience, facial recognition comes with deep costs to privacy and security. By the way, can anyone remember Silicon Valley asking for permission to use your face?

Mankind has long feared that some totalitarian state, as vividly described by visionary writers like George Orwell (1984), Aldous Huxley (Brave New World), and Yevgeny Zamyatin (We), will ultimately arise and enslave him in an inescapable technological dystopia.

However, it is not usually the technology, an inherently neutral force, which men fear most; the deep distrust is directed at the shadowy individuals behind the curtain who may be tempted to use their tinkering prowess for ulterior motives, like crushing human freedom underfoot.

Consider, for example, how futurists warned of the day when consumers would voluntarily line up for the pleasure of being ‘microchipped’ so as to more efficiently access the ‘matrix’ with a magical wave of the hand. Well, that drop of derangement has already seen the light of day. The technology, injected under the skin, was thought to be the endgame for personal freedom – the so-called ‘mark of the beast,’ according to some apocalyptic critics. Unless human beings submitted to being electronically chipped, the doomsayers said, they would be barred from engaging in vital social activities, including shopping, banking and using the Internet. In effect, a death sentence.

Today, however, with radical advances being made in the field of facial recognition technologies, it looks as though the promising chip has met its match.

According to a recent article in MarketWatch, a new ‘frictionless’ consumer dawn is on the horizon, where cumbersome accessories like wallets and purses – together with the outdated cash and credit cards they hold – will be replaced by a payment scheme known as the “biometric mobile wallet.” Sounds like the ultimate gift this holiday season, right? Well, think again. First of all, the name of the technology is very misleading, since there is no leather billfold to wrap up and place under the Christmas tree. That’s because the system works off an individual’s distinctive bodily features – face, fingerprints and retinas. In other words, the ultimate ‘face control.’

As is to be expected, the article heaps boundless praise on the technology, which is on the verge of going live. Soon, harried shoppers will no longer have to fumble around in their purses to find their credit cards. Just stare blankly into the “in-store facial recognition machines” and you’re on your way. Besides that small convenience for the consumer, the system has the added ‘advantage’ of making people spend more money, since the ‘frictionless’ transaction gives the illusion – and a potentially dangerous one at that, considering the US consumer’s outstanding debt burden – that no dirty money has changed hands.

Still, something doesn’t feel right. Perhaps it has to do with the summary of the article, which says that the deployment of facial recognition will remove “the last physical barrier between our bodies and Corporate America.” I felt the urge to take a very hot shower after reading that line. And later in the same article, the creep factor went into overdrive with a similar quote by Aram Sinnreich, associate professor of communication studies at American University.

“Every technological necessity exists in the real world and is used commercially,” Sinnreich said matter-of-factly. “It’s the neoliberal takeover of the human body.”

Yet another loaded comment, and one that screams ‘enslavement’ minus the unfashionable chains of yesteryear. The question is, who exactly will benefit from this so-called “technological necessity,” and to what end? The only real benefits I can see from facial recognition, at least from the consumer’s perspective, are that people no longer have to worry about losing their wallets or about wasting an extra 30 seconds using their credit cards.

But do those tiny advantages outweigh the massive concerns over ‘identity theft,’ for example? After all, while it remains relatively easy to cancel a stolen credit card, how exactly does one cancel their facial features? Moreover, what if my own personal views clash with those of the “neoliberals” who, as the headline of the article openly admits, own everyone’s facial features? Will my ability to buy food, access my smartphone and book a flight be impeded by the Silicon Valley overlords? Who will stop them?

To get an idea of where the future of facial recognition could be heading, one need only consider China, which is in the process of rolling out its so-called ‘Social Credit System,’ a fusion of ‘Big Data’ and ‘Big Brother’ that ranks its citizens on everything from their finances, to their social media behavior, to the books they are reading. Falling afoul of the system could have harsh consequences, like being denied the ability to purchase airline tickets or even to get a job. Facial recognition will play no small part in the development of this all-encompassing matrix, which relies upon some 200 million surveillance cameras. And let’s face it: if the Chinese can find a way to electronically monitor their 1.3 billion people, then anyone can. After all, the technology that picks out a lone face in a crowd is the same one that lets iPhone users unlock their devices with Apple’s ‘Face ID.’

Meanwhile, the Western world is gradually catching up to Chinese levels of mass surveillance. Of the top 10 cities in the world with the highest number of CCTV cameras, eight are located in China. However, the United States and the UK also made the list, with London taking sixth place and Atlanta, Georgia, grabbing the tenth spot.

At the same time, new facial recognition applications continue to multiply. For example, computers are now able to gauge the emotional state of motorists just by analyzing their facial image. Will drivers be fined for ‘road rage’ even before an outburst occurs? Is this the sort of controlled world we want to inhabit, where our identities and emotional states are tracked everywhere we go? Whatever the case may be, one thing is certain: IT companies have no intention of holding a referendum to determine how their users feel about this technology.

In a 2018 paper entitled ‘The Data of You: Regulating Private Industry’s Collection of Biometric Information,’ attorney Hannah Zimmerman notes that there is “no generally applicable federal law that regulates the private sector’s collection and use of biometric information in the US.” Given the upsurge in facial recognition implementation, that is a worrying disclosure.

Zimmerman goes on to warn that businesses “already track consumers’ every move online for advertising and behavioral analysis purposes,” while the introduction of facial recognition would let them “track us in the real world.” Again, we are left to ponder the question: is this a desirable condition for human beings?

While the implications that arise from such technology are enormous, and not all necessarily negative, it stands to reason that safeguards must be established to ensure that people do not wake up one day to find themselves enslaved by the invisible chains of this new technology, which will only serve mankind’s best interests so long as its owners strive for that to happen. Thus far, their true intentions are not so obvious, and that unpredictability should be a source of concern to everyone.

Information Reference: https://www.rt.com/op-ed/468603-facial-recognition-shopping-mass-surveillance/
