Programming

Slow SPAs are worse than NoSPA

I got a digital subscription to the Economist for my birthday last month so I’ve started reading a lot more content on their site. As a result I’ve noticed a lot of weirdness with their page loads that was hardly noticeable when I was using the free tier of a few articles per week.

The site seems to be built as a SPA with a page shell that loads quite quickly but takes far longer to fill with content, and it has some odd layout choices along with occasional pops and content shifts.

The basic navigation between the current issue index and the articles is hampered by what appears to be a slow load or render phase. Essentially it is hard to know whether a click on a link or the back button has registered.

By replacing traditional page navigation the site actually delivers a worse experience. It would be better if the effort going into the frontend went into faster page serving instead.

I’m not sure if the page is meant to be doing something clever with local storage for offline use, but it seems to need a connection while browsing, so I’m assuming the subscription and payment gateway are what prevent a fast server-side load of the content.

It still feels as if the page and the first 200 words or so should be public and CDN-cached, with the remaining content of the article loaded after page-load for subscribers.
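As a sketch of that split, assume the statically rendered, CDN-cached page already contains the first couple of hundred words, and that a hypothetical authenticated endpoint (the URL and the isSubscriber flag are my inventions, not the Economist’s actual API) returns the rest for subscribers once the page has loaded:

```typescript
// Sketch only: fills in the rest of an article for subscribers after page-load.
// The endpoint name and the isSubscriber check are assumptions for illustration.
async function hydrateArticle(articleId: string, isSubscriber: boolean): Promise<void> {
  if (!isSubscriber) {
    return; // free readers keep the cached excerpt and a subscription prompt
  }

  // Hypothetical authenticated endpoint returning the remaining paragraphs as HTML.
  const response = await fetch(`/api/articles/${articleId}/full-body`, {
    credentials: "include", // send the subscription session cookie
  });
  if (!response.ok) {
    return; // leave the cached excerpt in place rather than blanking the page
  }

  const fullBody = await response.text();
  const container = document.getElementById("article-body");
  if (container) {
    container.innerHTML = fullBody; // swap the excerpt for the complete article
  }
}
```

The excerpt stays fast and cacheable for everyone, and only the subscriber-specific fetch has to touch the slower, authenticated path.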

The current solution feels like someone has put a lot of effort and thought into making something that is actually worse than a conventional webpage, and that seems a shame for a site with relatively little content that is mostly updated once a week.

Programming

No-one loves bad ideas

Charles Arthur has an interesting piece of post-Guardian vented frustration on his blog. His argument about developers and journalists sitting together is part bonkers opinion and part correct. Coders and journalists generally work on different timeframes, and newsroom developers don’t focus enough on the friction in the tools they are creating for journalists.

Journalists, however, focus too much on the deadline and the frenzy of the news cycle. I often think newsroom developers are a lot like the street sweepers who clean up after a particularly exuberant street market: everything has to be tidied up and put neatly away before the next day’s controlled riot takes place.

The piece of the article I found most interesting was something very personal though. The central assumption that runs through Arthur’s narrative is that it is valuable to let readers pre-order computer games via Amazon. One of the pieces of work I’ve done at the Guardian is to study the value of the Amazon links in the previous generation of the Guardian website. I can’t talk numbers, but the outcome was that the cost of my time spent working out how much money was earned ate up all the “profits”. You open the box but the cat is always dead.

Similarly, Arthur’s quixotic quest meant that he spent more in developer time than the project could ever possibly earn. Amazon referrals require huge volumes to be anything other than a supplement to an individual’s income.

His doomed attempt to get people to really engage with the idea reflected the doomed nature of the idea itself. British journalism favours action and instinct, and sometimes that combination generates results. Mostly, however, it just fails, and regardless of who is sitting next to whom, who can get inspired by a muddle-minded last-minute joyride on the Titanic except deadline-loving action junkies?

Web Applications, Work

The myth of “published” content

Working at the Guardian you often end up having conversations with people about the challenges you face in scaling to meet the often spiky traffic you get in online media. One thing that comes up again and again is the idea that content, once published, is essentially static. Now there is a lot to be said for this, as digital journalism sticks pretty close to a lot of the conventions of print media; copy is often culled from the print version and follows the 24-hour media cycle quite strongly.

However, what is often surprising is the number of edits a piece of content receives, particularly if it is not a print feature article. The initial version of an article is often just the mandatory information and a few paragraphs, sufficient to get across the basic story. It then goes through a number of revisions, which often happen while the article is still a draft. Often, but not always.

Once the article gets published online, though, it triggers a new wave of edits as the language gets cleaned up and readers, editors and lawyers all descend on it. Editors now have far more tools to see how the audience is reacting to a piece of content and how it is playing on social media. Articles also get picked up externally, and that means making sure each one works as a landing page.

Naturally, stories often develop their own momentum, which requires you to switch from a single piece to a set of stories approaching different aspects of the overall reporting. You then need to link the different pieces of content together to form a logical package of content.

One thing that is interesting is looking at how many articles are changed after seven days. It is a surprisingly large number, as new stories often create a need for historical context, and older stories can look dusty in the light of breaking events. We have also had strange things happen with social news, where aggregating sites pick up a story that was overlooked at the time.
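As a rough sketch of how that could be measured, assuming a hypothetical Article record with publication and last-modified timestamps (not the Guardian’s actual schema):

```typescript
// Sketch only: counts articles edited more than a week after publication.
interface Article {
  id: string;
  publishedAt: Date;
  lastModifiedAt: Date;
}

const WEEK_MS = 7 * 24 * 60 * 60 * 1000;

function editedAfterAWeek(articles: Article[]): number {
  return articles.filter(
    (a) => a.lastModifiedAt.getTime() - a.publishedAt.getTime() > WEEK_MS
  ).length;
}
```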

All of this means that you cannot naively treat content as static. Instead you have an interesting decaching problem: it is true that content doesn’t change much, until it does start changing, and then the page needs to reflect the changes reasonably rapidly if you want to be picked up by things like Google.
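One way to square those two facts, sketched here with a made-up CDN purge endpoint rather than any system we actually run, is to serve articles with a reasonable cache lifetime and issue an explicit purge whenever a new revision is published:

```typescript
// Sketch only: cache quietly-aging content, purge it the moment an edit goes live.
const ARTICLE_TTL_SECONDS = 3600; // fine while the article is not changing

function cacheHeaders(): Record<string, string> {
  return { "Cache-Control": `public, max-age=${ARTICLE_TTL_SECONDS}` };
}

// Called from the publishing flow whenever a new revision goes live, so readers
// and crawlers see the change well before the TTL would have expired.
async function purgeArticle(articleUrl: string): Promise<void> {
  // Hypothetical stand-in for whatever invalidation API the CDN exposes.
  await fetch("https://cdn.example.com/purge", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ url: articleUrl }),
  });
}
```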

 
