THE BIZNOB – Global Business & Financial News – A Business Journal – Focus On Business Leaders, Technology – Entrepreneurship – Finance – Economy – Politics & Lifestyle

Technology

Facebook’s Mood Experiment Raises Concern

When people wake up in the morning, they expect to be in control of their own happiness throughout the day. It’s a part of life; it’s a freedom each person is given. News recently broke, however, that Facebook manipulated this liberty, and many people are upset over the intrusion into their personal sentiments.

According to CNN, Facebook altered the content that 690,000 users were shown in order to determine whether negative or positive messages had an impact on their mood. The research findings, from Cornell, the University of California, San Francisco and Facebook, were just released in the Proceedings of the National Academy of Sciences.

The findings were significant: users who were shown negative messages in their News Feeds were more likely to create posts with negative connotations, and likewise, users who saw positive messages generated more uplifting posts.

“The goal of all of our research at Facebook is to learn how to provide a better service. Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone. I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused,” Adam Kramer said in a recent post.

Many people have been rubbed the wrong way by this study, but few realize they did give their consent—it all hides within the “Agree” button. So who is to blame?

Susan Fiske, editor of the research, told The Atlantic in a phone interview, “I was concerned until I queried the authors, and they said their local institutional review board had approved it… on the grounds that Facebook apparently manipulates people’s News Feeds all the time… I understand why people have concerns. I think their beef is with Facebook, really, not the research.”

The researchers did have procedural approval, so there is no room to point fingers there. However, the public is demanding ethical consideration from the company: what is legal is not always fair.

Photo: Screenshot

