Not by Links Alone

At this unthinkably late hour, many of even the most recalcitrant journalists and newsy curmudgeons have given themselves over, painfully, to the fundamentally important fact that the economics of abundance now govern their world.

For many, of course, stemming that tide is still paramount. Their goal, as David Carr writes, is to squelch the “new competition for ads and minds.” Thus Walter Isaacson’s “E-ZPass digital wallet” and Alan Mutter’s “Original Sin.” Thus Michael Moran’s obnoxious “NOPEC.” Thus Journalism Online. And, of course, thus we have David Simon’s recent call for Congress to “consider relaxing certain anti-trust prohibitions” or this call in the Washington Post to rework fair use. I wish them all good luck, but mostly good night.

There are others, though, who think it’s great that the Internet and Google are opening up the news to competition. In fact, “Google is good” strikes me as nearly orthodox among the basically Internet-savvy set of news talkers. Marissa Mayer crows about how Google delivers newspapers’ Web sites one billion clicks a month, and Arianna Huffington insists that the future of news is to be found in a “linked economy” and “search engines” like Google.

In this narrative, Google’s the great leveler, ushering the world of journalism out of the dark, dank ages of monopoly and into the light, bright days of competition, where all news articles and blog posts stand on their own pagerank before the multitude of users who judge with their links and their clicks. Its ablest defender is probably Jeff Jarvis, author of What Would Google Do? Jarvis was relatively early in pointing out that “Google commodifies the world’s content by making it all available on a level playing field in its search.” In that and other posts at Buzz Machine, his widely read blog, Jarvis allows that Google “can make life difficult” but insists, “that’s not Google’s fault.” The reverence for Google is thick: “The smart guys are hiring search-engine optimization experts and trying to figure out how to get more people to their stuff thanks to Google.”

But defenders of Google’s influence on the broader market for news and newspapers themselves make a striking error in believing that the market for content is competitive. That belief is wrong—not just a little bit or on the margin, but fundamentally, and importantly, wrong.

Which is not to say that news publishers aren’t competing for readers’ eyeballs and attention. Publishers compete with one another all day long, every day—with some local exceptions, the news has always been competitive like a race, and is now more competitive like a market than ever before. But the market for that news—the place where consumers decide what to read, paying with their attention—is not competitive. Google may well be the great leveler, but down to how low a field?

To be very clear, this is far from a neo-classical purist’s critique that picks nits by abusing uselessly theoretical definitions. I am not a purist, an economist, or a jerk. This is reality, as best as I know it. Nevertheless, to say that the market for content is competitive is just to misunderstand what a competitive market actually entails. The market for news content as it currently stands, with Google in the middle, is a profoundly blurry, deeply uncompetitive space.

*    *    *

“The difficulty of distinguishing good quality from bad is inherent in the business world,” Nobel laureate George Akerlof wrote in the kicker of his most famous paper, published in 1970. “This may indeed explain many economic institutions and may in fact be one of the more important aspects of uncertainty.”

Akerlof fired an early shot in a scholarly marathon to study the effects of asymmetric information in markets. What do parties to a potential transaction do when they know different sets of facts? Maybe that seems like an obvious question, but economists in the middle of the twentieth century had been pretty busy worrying about perfecting complicated models despite their grossly simplistic assumptions.

So Akerlof set about to write about how markets can fail when some of those assumptions turn out to be bunk. The assumption he tested first, in “The Market for ‘Lemons’,” was certainty, and he showed that when sellers know more about the goods being sold than the buyers do, sellers abuse their privileged position and buyers leave the market.
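Akerlof’s unraveling logic is easy to see in a toy simulation. This is my own illustration with made-up numbers, not the exact setup of his paper: suppose a seller values a car at its quality q, a buyer at 1.5q, and buyers, unable to observe q, offer a price equal to 1.5 times the average quality of cars still on the market.

```python
# Toy sketch of Akerlof's "lemons" unraveling (illustrative assumptions only):
# quality q is roughly uniform on [0, 1); sellers value a car at q, buyers at
# 1.5 * q. Buyers can't see q, so they pay 1.5 times the average quality of
# cars still for sale; any seller whose car is worth more than that withdraws.

def lemons_market(rounds: int = 10, n: int = 100_000) -> float:
    qualities = [i / n for i in range(n)]  # approx. uniform qualities
    for _ in range(rounds):
        if not qualities:
            return 0.0
        avg_quality = sum(qualities) / len(qualities)
        price = 1.5 * avg_quality                         # buyers' offer
        qualities = [q for q in qualities if q <= price]  # good cars exit
    return max(qualities, default=0.0)

# Each round the best cars leave, the average falls, and the price falls
# with it -- after ten rounds only the worst lemons remain.
print(lemons_market())
```

Each round the cutoff shrinks by a factor of about 0.75, so the market for anything better than a lemon unravels almost completely.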

Writing in the same year, the economist Phillip Nelson studied the differences between what he called “search goods” and “experience goods.” Search goods and experience goods express a certain kind of asymmetry. For search goods, consumers can overcome the asymmetry before the point of purchase by doing their homework, while for experience goods, consumers must take their time and invest.

A pair of pants, for instance, is a search good—you can try before you buy, and shop around for the pants that fit you best. An apple, on the other hand, is an experience good—you don’t know whether you’ll like one until you consume it, and you can’t really try before you buy.

News articles are experience goods. Just as with an apple, you need to consume the story, reading the article or watching the video or so on, in order to judge its quality. “Stories can vary in length, accuracy, style of presentation, and focus,” writes economist James Hamilton in All the News That’s Fit to Sell. “For a given day’s events, widely divergent news products are offered to answer the questions of who, what, where, when, and why.” We can’t know which one’s best till we’ve read them all, and who’s got time for that?

Moreover, a multitude of subjective editorial decisions produce the news. Each reporter’s practices and habits influence what’s news and what’s not. Their learned methods, their assigned beats, and even their inverted pyramids shape what we read and how. Reporters’ and editors’ tastes, their histories, or their cultures matter, as do their professional ethics. Each article of news is a nuanced human document—situated aesthetically, historically, culturally, and ethically.

Ultimately, the news is afflicted with the problem of being an experience good more than even apples are. At least Granny Smiths don’t vary wildly from farmer to farmer or from produce bin to produce bin. Sure, some may be organic, while others are conventional. One may be tarter or crispier than another, but tremendous differences from the mean are very unlikely. With the news, though, it’s hard even to think of what the mean might be. It may seem obvious, but articles, essays, and reports are complex products of complex writerly psychologies.

For a long time, however, as readers, we were unaware of these nuances of production. That was, in some sense, the upshot: our experience of this journalism was relatively uncomplicated. This profound lack of context mattered much less.

Call it the myth of objectivity maybe, but what NYU professor Jay Rosen has labeled the “mask of professional distance” meant that we didn’t have much of a chance to bother with a whole world of complexities. Because everyone usually wore a mask, and because everyone’s mask looked about the same, we ignored—indeed, we were largely necessarily ignorant of—all the unique faces.

For a long time, therefore, the orthodox goal of American newspapers virtually everywhere was news that really wasn’t an experience good. When news existed only on paper, it hardly mattered what news was, because we had so few seemingly monochrome choices about what to read. We returned to the same newspapers and reporters behind the same masks over and over again, and through that repetition, we came subtly to understand the meaning and implications of their limited degrees of “length, accuracy, style of presentation, and focus.”

As a result, we often grew to love our newspaper—or to love to hate it. But even if we didn’t like our newspaper, it was ours, and we accepted it, surrendering our affection either way, even begrudgingly. The world of news was just much simpler, a more homogeneous, predictable place—there were fewer thorny questions, fewer observable choices. There was less risk by design. Our news was simpler, or it seemed to be, and we had little choice but to become familiar with it anyhow. One benefit of the View from Nowhere, after all, is that basically everyone adopted it—that it basically became a standard, reducing risk.

But a funny thing happened in this cloistered world. Because it seemed only natural, we didn’t realize the accidental nature of the understanding and affection between readers and their newspapers. If, as the economists would have it, the cost of a thing is what we’ve sacrificed in order to achieve it, then our understanding and affection were free. We gave nothing up for them—for there was scarcely another alternative. As a result, both readers and publishers took those things for granted. This point is important because publishers are still taking those things for granted, assuming that all people of good faith still appreciate and love all the good things that a newspaper puts on offer.

*    *    *

But when our informational options explode, we can plainly, and sometimes painfully, see that our newspapers aren’t everything. Different newspapers are better at answering different questions, and some answers—some as simple as what we should talk about at work tomorrow—don’t come from newspapers at all. So we go hunting on the Internet. So we gather. So we Google.

We have now spent about a decade Googling. We have spent years indulging in information, and they have been wonderful years. We are overawed by our ability to answer questions online. Wikipedia has helped immensely in our efforts to answer those questions, but pagerank elevated even it. Newspapers are just one kind of Web site among the many that have plunged into the scrum of search engine optimization. Everyone’s hungry for links and clicks.

And Google represents the Internet at large for two reasons. For one, the engine largely structures our experience of the overall vehicle. More importantly, though, Google’s organization of the Internet changes the Internet itself. The Search Engine Marketing Professional Organization estimates, in this PDF report, that North American spending on organic SEO in 2008 was about $1.5 billion. But that number is surely just the tip of the iceberg. Google wields massive power over the shape and structure of the Internet’s general landscape of Web pages, Web applications, and the links among them. Virtually no one builds even a semi-serious Web site without considering whether it will be indexed optimally. For journalism, most of the time, the effects are either irrelevant or benign.

But think about Marissa Mayer’s Senate testimony about the “living story.” Newspaper Web sites, she said, “frequently publish several articles on the same topic, sometimes with identical or closely related content.” Because links from around the Web are split among those similar pages, none of them earns the pagerank that a single page would. Mayer would have news Web sites structure their content more like Wikipedia: “Consider how the authoritativeness of news articles might grow if an evolving story were published under a permanent, single URL as a living, changing, updating entity.”
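The mechanics behind Mayer’s claim are easy to see in a toy power-iteration sketch. This is textbook pagerank over hypothetical blogs and story URLs of my own invention, not Google’s production algorithm: four blogs linking to one story URL versus the same four links split between two duplicate URLs.

```python
# Toy pagerank (power iteration) showing how duplicate URLs split authority.
# Hypothetical graph: blogs b1-b4 link either to one "story" URL or to two
# duplicates, "story_v1" and "story_v2". Illustration only.

def pagerank(links, d=0.85, iters=100):
    """Plain power-iteration pagerank; dangling mass is spread evenly."""
    nodes = set(links) | {v for outs in links.values() for v in outs}
    n = len(nodes)
    pr = {u: 1.0 / n for u in nodes}
    for _ in range(iters):
        new = {u: (1 - d) / n for u in nodes}
        for u in nodes:
            outs = links.get(u, [])
            if outs:
                for v in outs:
                    new[v] += d * pr[u] / len(outs)
            else:  # dangling page: redistribute its score to all pages
                for v in nodes:
                    new[v] += d * pr[u] / n
        pr = new
    return pr

split = pagerank({"b1": ["story_v1"], "b2": ["story_v1"],
                  "b3": ["story_v2"], "b4": ["story_v2"]})
merged = pagerank({b: ["story"] for b in ["b1", "b2", "b3", "b4"]})
# The single "living story" URL outranks either duplicate.
print(merged["story"], split["story_v1"], split["story_v2"])
```

In this little graph the merged URL scores roughly 0.52 while each duplicate scores roughly 0.29, which is exactly the splitting effect Mayer describes.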

Setting aside for the moment whatever merits Mayer’s idea might have, imagine the broader implications. She’s encouraging newspapers to change not just their marketing or distribution strategies but their journalism because Google doesn’t have an algorithm smart enough to determine that they should share the “authoritativeness.”

At Talking Points Memo, Josh Marshall’s style of following a story over a string of blog posts, poking and prodding an issue from multiple angles, publishing those posts in a stream, and letting the story grow incrementally, cumulatively might be disadvantaged because those posts are, naturally, found at different URLs. His posts would compete for pagerank.

And maybe it would be better for journalism if bloggers adopted the “living story” model of reporting. Maybe journalism schools should start teaching it. Or maybe not—maybe there is something important about what the structure of content means for context. The point here isn’t to offer a substantive answer to this question, but rather to point out that Mayer seems unaware of the question in the first place. It’s natural that Mayer would think that what’s good for Google is good for Internet users at large. For most domestic Internet users, after all, Google, which serves about two-thirds of all searches, essentially is their homepage for news.

But most news articles, of course, simply aren’t like entries in an encyclopedia. An article of news—in both senses of the term—is substantially deeper than the facts it contains. An article of news, a human document, means substantially more to us than its literal words—or the pageranked bag of words that Google more or less regards it as.

Google can shine no small amount of light on whether we want to read an article of news. And, importantly, Google’s great at telling you when others have found an article of news to be valuable. But the tastes of anonymous crowds—of everyone—are not terribly good at determining whether we want to read some particular article of news, particularly situated, among all the very many alternatives, each particularly situated unto itself.

Maybe it all comes down to a battle between whether Google encourages “hit-and-run” visits or “qualified leads.” I don’t doubt that searchers from Google often stick around after they alight on a page. But I doubt they stick around sufficiently often. In that sense, I think Daniel Tunkelang is precisely correct: “Google’s approach to content aggregation and search encourages people to see news…through a very narrow lens in which it’s hard to tell things apart. The result is ultimately self-fulfilling: it becomes more important to publications to invest in search engine optimization than to create more valuable content.”

*    *    *

The future-of-news doomsayers are so often wrong. A lot of what they said at Kerry’s hearing was wrong. It’s woefully wrongheaded to call Google parasitic, if only because the Internet without it would be a distinctly worse place. There would be, I suspect, seriously fewer net pageviews for news. And so it’s easy to think that they’re wrong about everything—because it seems that they fundamentally misunderstand the Internet.

But they don’t hold a monopoly on misunderstanding. “When Google News lists one of our stories in a prominent position,” writes Henry Blodget, “we don’t wail and moan about those sleazy thieves at Google. We shout, ‘Yeah, baby,’ and start high-fiving all around.” To Blodget, “Google is advertising our stories for free.”

But life is about alternatives. There’s what is, and there’s what could be. And sometimes what could be is better than what is—sometimes realistically so. So however misguided some news executives may have been or may still be about their paywalls and buyouts, they also sense that Google’s approach to the Web can’t reproduce the important connection the news once had with readers. Google just doesn’t fit layered, subtle, multi-dimensional products—experience goods—like articles of serious journalism. Because news is an experience good, we need really good recommendations about whether we’re going to enjoy it. And the Google-centered link economy just won’t do. It doesn’t add quite enough value. We need to know more about the news before we sink our time into reading it than pagerank can tell us. We need the news organized not by links alone.

What we need is a search experience that lets us discover the news in ways that fit why we actually care about it. We need a search experience built around concretely identifiable sources and writers. We need a search experience built around our friends and, lest we dwell too snugly in our own comfort zones, other expert readers we trust. These are all people—and their reputations or degrees of authority matter to us in much the same ways.

We need a search experience built around beats and topics that are concrete—not hierarchical, but miscellaneous and semantically well defined. We need a search experience built around dates, events, and locations. We need a search experience that’s multi-faceted and persistent, a stream of news. Ultimately, we need a powerful, flexible search experience that merges automatization and human judgment—that is sensitive to the very particular and personal reasons we care about news in the first place.

The people at Senator Kerry’s hearing last week seemed either to want to dam the river and let nothing through or to whip its flow up into a tidal wave. But the real problem is that they’re both talking about the wrong river. News has changed its course, to be sure, so in most cases, dams are moot at best. At the same time, though, chasing links and clicks, with everyone pouring scarce resources into an arms race of pagerank while aggregators direct traffic and skim a few page views, isn’t sufficiently imaginative either.

UPDATE: This post originally slipped out the door before it was fully dressed. Embarrassing, yes. My apologies to those who read the original draft of this thing and were frustrated by the unfinished sentences and goofy notes to self, and my thanks to those who read it all the same.

24 Responses to “Not by Links Alone”


  1. 1 Jeff Jarvis 2009 May 23 at 12:40 pm

    Wonderful piece, Josh. I respond at the automated, SEO-friendly link above.

    • 2 Seth Wagoner 2009 May 25 at 12:50 am

      Jeff, I think you were referring to the “possibly related posts” section in your comment, which currently does not have a link to your response in it, but probably did when you wrote the comment. Thus illustrating one of the problems with automatically generated SEO-friendly stuff.

  2. 3 Ken Leebow 2009 May 23 at 1:02 pm

    Interesting perspective . . . I’m still waiting for someone in the “news” business to define what news is. With the economics of abundance, I call it infinity, I believe news is not what it used to be. And, the people in the “news” business do not realize it – Google is only one piece of the pie.

  3. 4 RickWaghorn 2009 May 23 at 1:09 pm

    Josh,

    ‘We need a search experience built around concretely identifiable sources and writers…’

    Think Martin Moore and some bloke called Sir Tim Berners-Lee are the ones you need to talk to; thinking the very same thoughts; and armed, of course, with a dollop of Knight money…

    http://outwithabang.rickwaghorn.co.uk/?p=231

    All the best, etc

    Rick

  4. 5 Jeff Mignon 2009 May 23 at 2:05 pm

    Great piece Josh. Your point about filtering our news/info experience is key. Google does not do it (yet) properly. We need friends, experts, journalists, algorithms… and more to be named. Sometimes I wonder if finally Facebook and other social media sites are not in fact creating these new spaces for information. I wrote this piece about the need for filtering: My ideal media needs to be “crowdfiltered” and “crowdproduced” — http://mediacafe.blogspot.com/2009/01/my-ideal-media-needs-to-be.html

  5. 6 ryan 2009 May 23 at 8:11 pm

    Good read. Chunking articles into small digestible segments provides one very large advantage: the ability to target a wider array of search traffic due to the multiple TITLES. Additionally, natural traffic is a very large ranking factor – links are slowly losing *some* of their mojo.

  6. 7 Jonathan Zuk 2009 May 23 at 9:52 pm

    There have been so many essays on the future of news gathering that have not discussed economic theory at all. It was awesome to read a piece that finally bridged that gap. Relating to your statement, “We need a search experience built around dates, events, and locations”, I’m wondering what do you think about the new search engine WolframAlpha – http://www.wolframalpha.com/ and what it (and similar semantic search technologies) could mean for journalism?

  7. 8 billbennettnz 2009 May 24 at 2:47 am

    My biggest fear is about what Google and the economics of online publishing do to the quality of news. Some major news sites, in this region of the world The Sydney Morning Herald would be a good example, are very tabloid. Celebrities, minor sexual titillation and violent crime are all pushed to the fore. Even new products get prominence. The big analytical news pieces that explain what is going on in areas such as the economy are pushed down the agenda.

    This isn’t just a curmudgeonly whinge, my fear is that all news depends on the total news ecosystem. And this is being slashed and burnt faster than you can say tropical rain forest. Those important, authoritative pieces are the metaphorical equivalent of mangrove swamps. Not exciting, but an essential component of the whole.

  8. 9 Marah Marie 2009 May 24 at 4:46 am

    Marissa is wrong. Marissa is also brilliant. She [and Google, the company she represents] do not want online newspapers to succeed without them. Those one billion clicks they send the online papers every month don’t make them one billion dollars; until they do, Marissa is going to keep spouting this nonsense, probably until the online papers go bankrupt. If Marissa is reading this she probably knows what I will say next and chances are she’s giggling that crazy giggle while she waits for it:

    “The living story” is a great way to do just three things very well: lose visitors, page views, and money.

    Why?

    If you constantly update the original story on say, the new virus attacking Granny Smith apples, without giving readers any indication that you added new content to that story (perhaps again and again), returning visitors will not go back and read it again – what for? – they don’t know it was updated – and no new visitors will see the updates, either, since they’re checking for *new* content linked to or visible from the top of the website’s main page.

    In addition, updated stories don’t get resubmitted to RSS/Atom, so they cannot be re-syndicated – thus your website’s feed readers have no idea you simply *updated* various posts unless they scroll down and read each one of your posts to see.

    Furthermore, people will not continually resubmit updated stories to social bookmarking sites like Digg. They can’t. If they did, they would soon be banned for spamming, since Digg enforces both duplicate URL and duplicate submission penalties.

    Perhaps most importantly, updated news kept at the original URL will not be placed at the top of Google News and/or Google Blog listings a second, third, and fourth time so no one searching Google for the latest updates on your “Virus attacks Granny Smith Apples” piece will be able to find it from search anymore – that post will be pushed down by the tons of other sites now reporting the latest updates on your story, even though you were (perhaps) the first person or newspaper on the web to write anything about the screwed-up apples.

    Marissa’s method also destroys your PageRank, because if you *don’t* link back to your original articles from your new articles, Google will see that you don’t consider your original post “an authoritative resource”. Internal linking is extremely important – essential to PageRank for the most important pages on your site – so it *must* be done if you’re going to win Google’s game. Marissa is telling you not to do internal linking by stating that you should keep each topic, updated, at one single URL only – you have got to ask yourselves “why”, people. Think it over.

    So…What will Marissa’s suggested method for maintaining and updating *your* website accomplish?

    1) No one knows you keep updating breaking news at the original URL it was posted on, so returning visitors don’t read the original posts again (why would they?), and new visitors have no idea you updated the story, either, so you’ve lost both new visitors and old visitor’s fresh click-throughs and page views

    2) Feed readers will not get notifications that your site’s feed RSS/Atom has updated because it has *not* updated – instead, you’ve edited the story but kept it at the original URL – feed readers won’t know that, so you’ve lost *every single one of them*

    3) People cannot resubmit your original (even if freshly-updated) story to sites like Digg again and again, so you have automatically lost up to 150k new visitors and perhaps double the amount of page views, depending on how big the news in your updates was

    4) Google News and Google Blog Search *do not* re-crawl updated stories kept at the original URL, so the tons of visitors you might have had from Google if you simply made a new post and linked to your original one from it are *poof* also gone

    5) Internal PageRank scores are destroyed because you are not doing essential internal linking to your most important and authoritative stories

    RESULTS: Without the new and returning visitors that Google and the site itself (through RSS and new posts at the top of the page, clearly visible to all) could have produced if you had simply *kept making new posts on any given topic while linking back to the original*, your website (or newspaper, as is the case) will lose tons of money since there will be that many less people to click on your ads. On top of that you will lose any untold amount of potential new fans and return visitors who do not know when or how you’re updating breaking news. Your Internal PageRank will also get hurt which will hurt your ranking in Google’s search results.

    The natural question that arises from a careful examination of the results of taking Marissa’s advice is:

    IF IT’S THAT BAD FOR US WHY DOES MARISSA TELL US TO DO IT?

    My guess? The Ice Queen knows what we don’t: that eventually newspapers will have to pay to show up in Google at all. In the meantime she wants to soften them up to the idea by telling them how to destroy themselves. She knows damn well most newspaper people don’t know a drop of SEO, so they will follow her advice like the helpless sheep that they, in effect, are. Marissa is the very epitome of all that evil at Google that everyone keeps talking about.

  9. 11 Leigh Hanlon 2009 May 24 at 11:54 pm

    I like the “living story” idea. It would be interesting to see how it would affect editorial decisions. Have any publications adopted this yet?

  10. 12 Bertil Hatt 2009 May 25 at 12:13 am

    Sorry to bring disagreement here, but I actually love Marissa’s idea. Wouldn’t it be great to have a complete, up-to-date paper that has been edited to include (multi-layered) subtleties? Wouldn’t it be great, as a journalist, to have your ideas known as soon as you have them, and still be able to improve what you wrote?

    Some point out that this would kill traffic: I won’t. What kills traffic is not offering what your readers want, and believe me, they want humble journalists who dare say they can improve their work; they want papers that are living and take expertise into account, instead of clipping it into an out-of-context quote. If you want readers to come back, have the browser/server remember what was the last version they read, and put the differences in bold red: those who know will skim them; those who thought they knew, but need more of an update, have it all around. Put all the latest, major updates on the front page.

    At least. . . Try it: it’s not like you have another future that waits for you, is there?

    • 13 Neel Chopra 2009 May 26 at 9:56 pm

      Respectfully, I think the discussion of the living story idea to date fails to respond to the challenge that Mr. Young has raised.

      Mr. Young’s argument, as I understand it, is that the changes in the market for news brought on by the internet require us to think differently about how we organize access to news content. In a pre-internet age, news was not a true experience good because the consumer was severely limited in how much and what types of news he could experience. In the internet age, however, news is an experience good, and the way we organize that news ought to recognize that.

      Accordingly, Mr. Young correctly points out, a little competition in the market for news (as opposed to competition between the news itself) would be good. Competition would allow an out-of-the-box thinker to revolutionize how we think about the organization of news content. While the Google model is good for many things, its system is not designed to help people purchasing experience goods to use the experiences of trusted sources. When people want to learn about the experiences of others who have eaten at restaurants or seen concerts or theatre, they look past the information that can be gleaned from a Google search because the rankings of anonymous internet users are not reliable enough. When one is getting information about an experience good, what one wants is information about the experiences of people one trusts. This seems like such an obvious notion.

      The brilliance of Josh’s post is that it (1) correctly identifies news as an experience good and (2) applies the obvious truth that in searching for information about the value of an experience good, we demand information from a reliable source. This argument leads to the inescapable conclusion that in this day and age, the Google search is insufficient. While the Google search is valuable for a good many things, just as most consumers do not use it to sift through restaurants or theatre options, they will eventually look past the Google search in sifting through news stories.

      I am now convinced that what we are waiting for is someone to come up with an efficient way to organize news so that consumers can sort through the myriad articles about a given topic and determine which one would be best to experience without having to experience them all personally. Whatever the merits of the living story idea are, it does not seem to answer Young’s critique or meet this challenge.

  11. 14 Nico Flores 2009 May 28 at 9:43 pm

    Very nice post. But I disagree with the basic position – I don’t think news articles are consumer goods at all, either of the ‘search’ or ‘experience’ types. They are ‘industrial’ goods, chosen by people who are not their final consumers.

    We don’t first say “I want to read about the GM bankruptcy” and then go to a search engine and type “GM bankruptcy”. Rather, we say “I want to know what the news is”, then go to a place we know by name (which can be a newspaper site, Google News, or a blog we read for financial news), then learn about the GM bankruptcy via a headline that links to a story somewhere else, and then we follow the link and read the story on a newspaper site.

    Ad nauseam here: http://ondemandmedia.typepad.com/odm/2009/05/newspapers-are-in-the-breakfast-business.html


  1. 1 BuzzMachine Trackback on 2009 May 23 at 12:36 pm
  2. 2 News, Search Experience, and Value | The Noisy Channel Trackback on 2009 May 23 at 8:02 pm
  3. 3 Not by Links Alone - Search Engine Optimization Trackback on 2009 May 24 at 5:20 am
  4. 4 Search: Concrete – Not Hierarchical « Wir sprechen Online. Trackback on 2009 May 24 at 7:11 am
  5. 5 broadstuff Trackback on 2009 May 24 at 8:43 pm
  6. 6 links for 2009-05-25 « David Black Trackback on 2009 May 25 at 8:11 am
  7. 7 Free market for the news? Please. « Networked News Trackback on 2009 May 26 at 6:28 pm
  8. 8 Diversions about visual processing and information asymmetries. | Taylor Davidson Trackback on 2009 May 26 at 6:32 pm
  9. 9 Google, and the Problem of “Two Democracies” « J-School: Educating Independent Journalists Trackback on 2009 June 26 at 5:26 pm
  10. 10 Online News: la battaglia degli aggregatori Trackback on 2009 September 10 at 9:30 am
