I know, I know.
This is a huge news story.
But if you’ve been reading this site, you’re already aware of the “fake news” problem.
It’s the problem of people who use the internet to make stuff up, distort reality, and spread misinformation about medical care.
And in recent weeks, the FDA has been cracking down on some of these people.
The fake news issue has been well documented over the years, and it has been exacerbated by the growing popularity of fake news sites.
The FDA recently launched a new program to combat fake news, and the agency’s goal is to stop people from spreading misleading, defamatory, or false information about medical services.
And this isn’t the first time the FDA has cracked down.
Back in December, the agency released a list of fake news stories that would have to be removed from the internet by March 5.
The removal requirement has been criticized by medical professionals and consumer groups, but it’s part of a broader effort to protect consumers from misinformation.
There’s no doubt that the internet has helped push a new wave of misinformation.
It’s no surprise that so many people are willing to share their personal experiences and medical information online, and that much of the time, it’s made up.
It’s easy to make up stuff on the internet, and people have a lot more power than they used to.
But the reality is that, as a consumer, you don’t always know what you’re buying.
The internet is a marketplace, and companies like Facebook and Twitter have all but become the gatekeepers to the information we consume.
So to protect the public from misinformation, the FTC is stepping up to the plate.
On January 17, the agency published an op-ed titled “The ‘Fake news’ crisis has been stoked by the internet.”
It was authored by FTC Commissioner Terri Burke, who told reporters that she considers the internet’s growth as a source of information “dangerous” because it encourages the spread of misinformation, which in turn leads to fewer people seeking treatment.
Burke called on social media platforms like Twitter, Facebook, and Reddit to be more vigilant when it comes to the dissemination of misinformation: “In the digital era, it is not a coincidence that misinformation is on the rise,” she said.
“People who spread misinformation should be held accountable, and if we can do so, we can also reduce the risk that people will choose alternative health care services or rely on health care providers that they have not seen before,” she added.
In fact, Burke said that she’s looking into how to better combat misinformation on social platforms.
She also said that it’s important for healthcare providers to have more information about patients and their needs, and not just what’s reported in the papers.
But not everyone is convinced that this is the best way to fight fake news.
Earlier this month, a former employee of the healthcare giant Medtronic told the Wall Street Journal that he was fired for sharing fake medical information on Facebook.
That story was picked up by CNN, the New York Times, and other media outlets, including the Associated Press.
One patient, who wished to remain anonymous, told the New Yorker that Medtronic did not have any oversight of the content on the company’s social media channels.
While the FDA says it’s investigating fake news on social networks, the reality of fake-news websites is a scary one for patients and healthcare professionals.
Some people feel like they’re constantly under attack by those who are using the internet as a weapon against them.
Many of us like to believe that the truth has a way of outflanking the lie.
But in practice, the truth is often harder to discern than the lies that circulate online.