Facebook's Liz Heron answered for a litany of perceived sins and slights last week during a conversation with The Atlantic's Alexis Madrigal and attendees at the Online News Association conference in Chicago. Journalists are anxious about being left out of the loop about how Facebook works, and they want answers.

Does Facebook play favorites in the News Feed algorithm? Nope, according to Heron, the company's head of news partnerships.

Does Facebook decide what news you see? No, your past behavior does, because the News Feed is highly personalized.

What's the best way to get my posts seen on Facebook? Are there any tricks I should know? Do I need to write "congratulations" in my post to get it boosted?

Nah, just post good content and everything will be fine.

In other words: If you want to be successful on Facebook, don't get caught up in the nuts and bolts of what it favors or disfavors about posts (and it won't tell you much about those nuts and bolts anyway, so that works out). Madrigal said to applause that Facebook could put an end to rampant speculation about the News Feed by just telling us how it works, but Heron said that would make it too easy to game the system.

Ultimately, Facebook says, it just wants you to post great content so it can surface that content for users, and "greatness" is determined by various measures of how users interact with it. It's a process that requires lots of honing, and Facebook wants you to believe the algorithm is an earnest effort to give users what they value most.

The problem: What if the posts that users value most aren't always what news organizations value most?

Existential fear of the algorithm

Facebook is in the business of serving individual users, Heron said, and the algorithm is so nuanced that specific gaming strategies — like adding "congratulations" to a post — don't make sense. But there's so much at stake here for news organizations that it's difficult for them to accept Facebook's just-trust-us approach to its algorithm. Our traffic — our livelihood — depends on figuring this out.

Madrigal opened the session by talking about "dark Facebook" — mobile traffic with no specific referral source that, he discovered through experimentation, could mostly be attributed to Facebook's mobile app (which is how 80 percent of Facebook users access the service, Heron said). So you might be two to three times more dependent on Facebook than you think, Madrigal said.

At ONA, anxiety about Facebook's increasing control over our traffic revealed itself in lots of questions: If I have 250,000 fans of my page, why don't they all see everything I post? Why does my journalism seem to reach fewer people than it used to? Is Facebook trying to pressure my news organization to spend money to boost my posts or take out ads?

But there are more existential fears behind this conversation, too: If Facebook isn't interested in exposing users to content that might be important but won't result in high engagement like softer news and quizzes do, what will happen to news literacy? What will happen to civic engagement? What happens to The News That Matters, if only Facebook gets to decide what matters?

Facebook would say it's not really deciding what news matters — it's just revealing what news truly resonates with each individual user. And that's how you end up with Ice Bucket Challenge videos dominating your News Feed instead of the latest information about the Ferguson protests. Why should Facebook serve you vegetables when it knows you'd rather have the 8 Most Insanely Unhealthy Restaurant Meals In America?