Blogging, Programming

Clojure Exchange 2016

At one point during this year's Clojure Exchange I was reflecting with Bruce Durling on the numerous problems and setbacks there had been in organising the 2016 exchange, and he simply replied: "Yeah, it was a 2016 type of conference". So that's all I really want to say about the behind-the-scenes difficulties; despite the struggles I think it was a decent conference.

Personal highlights

James Reeves's talk on asynchronous Ring was an excellent update on how Ring is being adapted to enable asynchronous handlers now and non-blocking handlers in the future. I didn't know that there isn't an equivalent of the Servlet spec for Java NIO-based web frameworks.
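
For anyone who hasn't seen the new style, here is a minimal sketch of the three-argument asynchronous handler that Ring 1.6 introduced (the namespace and handler names here are my own, not from the talk):

```clojure
(ns example.async-handler
  (:require [ring.adapter.jetty :refer [run-jetty]]))

;; An asynchronous Ring handler takes the request plus respond and raise
;; callbacks, rather than returning a response map directly.
(defn hello-handler
  [request respond raise]
  (future
    (try
      (respond {:status  200
                :headers {"Content-Type" "text/plain"}
                :body    "Hello from an asynchronous handler"})
      (catch Exception e
        (raise e)))))

;; The Jetty adapter has to be told to call handlers with the async arity.
(defn -main []
  (run-jetty hello-handler {:port 3000 :async? true}))
```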

The Klipse talk was both short and hilarious, with a nicely structured double act illustrating the value of being able to evaluate code dynamically on a static page.

David Humphrey's talk, Log all the things, was pretty comprehensive on the subject of logging from Clojure applications. It was one of those talks where you felt "well, that's been sorted then".

Both Kris's keynote and Christian's Immutable back to front talked not just about the value of Clojure but also about how you can apply the principles of Clojure's design across your whole solution.

One of the most interesting talks was a visualisation of prisoner's dilemma strategies in the browser. It was visual, experimental and informative.

Henry Garner's talk on data science with Clojure was also interesting, with some nice dynamic distributions and a discussion of multi-armed bandit analysis. Sometimes I feel a lot of the data science material is too esoteric, with too little tangible output; this talk felt more relatable in terms of making dynamic variant testing less painful.

Disappointments

Not everything sings on the day. Daan van Berkel's talk on Rubik's Cubes suffered a technical failure that meant his presentation was not dynamically evaluating and therefore became very hard to follow. We should have tried to switch talks around, or taken a break and tried to fix it.

The AV was a general rumbling problem, with a few speakers having to switch mics in the middle of their talks.

Hans Hubner's talk on persistence was interesting but too quick and too subtle.

We should have had the two Spec talks closer together and earlier in the day. The things that people are doing with it are non-trivial and it is still relatively new.

clojure.spec

Spec is interesting for the community generally. It has become very popular very quickly, and it is being used for all kinds of things.
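
For those who haven't tried it yet, here is a hypothetical example of the kind of spec definition people are writing (using the clojure.spec.alpha namespace; the domain keys are invented for illustration):

```clojure
(require '[clojure.spec.alpha :as s])

;; Hypothetical domain specs: namespaced keys plus ordinary predicates.
(s/def ::account-id uuid?)
(s/def ::email (s/and string? #(re-find #"@" %)))
(s/def ::customer (s/keys :req [::account-id ::email]))

(s/valid? ::customer {::account-id (java.util.UUID/randomUUID)
                      ::email      "jane@example.com"})
;; => true

;; explain describes why a value fails to conform
(s/explain ::customer {::email "not-an-email"})
```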

One theme that came up in the conference was the idea that people wanted to share their spec definitions across the codebase. This seems a bad idea and a classic example of overreach: if someone said they had defined all their domain classes in a single Java jar and shared it across the company, you'd probably think that was a bad idea. It's no better here just because it is Clojure.

The use of Spec was also interesting from a community point of view, as the heaviest users of Clojure seemed to be doing the most with it. The bigger the team and the codebase, the quicker people have been to adopt Spec, and in some cases to switch from Schema to Spec.

On the other hand, the people using Clojure for data processing, web programming and things like ClojureScript have not really adopted Spec, probably because it simply doesn't add a lot of benefit for them.

So for the first time in a while we have something that requires some introduction for those new and unfamiliar with it, but is being used in really esoteric ways by those making the most of it. There is quite a big gap between the two parts of the community.

The corridor track

Out of the UK conferences I went to, Clojure Exchange felt like it had the best social pooling of knowledge outside of Scale Summit. Maybe it was because I knew more people here, but the talks also had all kinds of interesting little tips. For example, during Christian's talk he mentioned that S3 and CloudFront make for one of the most reliable web API deployment platforms you can choose to use. I ended up making a huge list of links, reminders and things to follow up on. I've also included links to lots of the GitHub repos that were referenced during the talks.

Next year

And so with a certain inevitability we are looking to the next Clojure Exchange. We're going to have a slightly bigger program committee which should make things easier.

The other thing we didn't really do that well this year was getting some talks to transfer from the community talk tracks to the conference itself. In 2017 we'll hopefully be more organised around the community and also have a series of talks that are tied in to the conference. If you're interested in either the organising or the talks, you can get involved via London Clojurians.

See you there!

Standard
Programming

/dev/winter 2015

The Dev Sessions are a Cambridge tech conference organised by the same people who do FPDays. The conference was free, held on a Saturday and was based in the Moeller Centre near the Churchill College campus. The only practical way to and from the station was via taxi (befriend those on expenses, thank you John Stevenson).

The talks were on broad topics relating to development. I had pitched a talk on Developer Autonomy, something I'm engaged with in the day job.

Misjudging the train times, I arrived a little late and jumped into the talk on using graph databases in game design. This turned out to be a much more general talk about how the speaker had created tooling to support the game designers at his job. Being a fellow tool provider, my interest was immediately piqued.

The game the team were building was some weird monster-trapping game, something like Pokemon but more complicated. To trap monsters you needed a trap and a lure or bait, and you would need to craft both, which meant acquiring recipes and components. Trapped animals provided you with components for other baits and traps, plus a monetary reward.

The talk was pretty wide-ranging. They were using Neo4J to analyse circular dependencies in the "quests" to capture monsters: when designers changed the game data it would get loaded into the graph and all the dependencies checked to ensure they flowed forward like a tree rather than containing inter-dependencies (circular references).
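
This isn't the speaker's actual tooling, but as a sketch of the kind of check being described, a cycle test over a quest/crafting dependency graph might look like this in Clojure (the item names are invented):

```clojure
;; Dependencies as a map of item -> set of prerequisites; a cycle means
;; something depends, directly or indirectly, on itself.
(defn cyclic?
  "True if the dependency graph contains a circular reference."
  [graph]
  (letfn [(visit [node path]
            (if (contains? path node)
              true                                          ; back edge => cycle
              (boolean (some #(visit % (conj path node))
                             (get graph node #{})))))]
    (boolean (some #(visit % #{}) (keys graph)))))

(cyclic? {:trap-a #{:bait-x} :bait-x #{:recipe-1} :recipe-1 #{}})
;; => false
(cyclic? {:trap-a #{:bait-x} :bait-x #{:trap-a}})
;; => true
```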

It was also possible to generate a "map" of everything in the game, showing which elements of the game were central and which were on the periphery (which should be the high-level monsters near the end of the game).

All the game data is in text files that are stored in Git; the developers had built a tool over the VCS that simplified the presentation of the many JSON files, but it was also possible for designers to edit them directly with whatever editor they favoured.

All the game data then gets built, validated and packed so it can be shipped off to the servers to power the game.

I think, if I understood the talk correctly, that the build also includes the localised text which is then powered from the server rather than updating a binary datafile on the client.

The final really interesting part of the talk involved the use of genetic algorithms to try to create game data. Data is captured from the game indicating what percentage of players have captured a particular monster. The designer can then enter the capture percentage they are aiming for, and the program goes off and tries to generate variations on the monster stats and trap requirements that it predicts will be more achievable by players. If any suitable combinations are found, the designer can review them and choose the one they prefer.
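
As a very rough illustration of the approach, rather than the studio's actual system, a mutate-and-select loop over invented monster stats might look something like this in Clojure, with predict-rate standing in for a model fitted to real capture data:

```clojure
(defn mutate
  "Randomly nudge a monster's stats (hypothetical representation)."
  [stats]
  (-> stats
      (update :hit-points    #(max 1 (+ % (- (rand-int 21) 10))))
      (update :trap-strength #(max 1 (+ % (- (rand-int 3) 1))))))

(defn fitness
  "Higher is better: closeness of the predicted capture rate to the target."
  [predict-rate target stats]
  (- (Math/abs (double (- (predict-rate stats) target)))))

(defn evolve
  "Simple generational loop: keep the best candidates, breed by mutation."
  [predict-rate target seed {:keys [population generations keep-n]}]
  (loop [candidates (repeatedly population #(mutate seed))
         g          generations]
    (if (zero? g)
      (apply max-key #(fitness predict-rate target %) candidates)
      (let [best (take keep-n (sort-by #(- (fitness predict-rate target %))
                                       candidates))]
        (recur (for [s best, _ (range (quot population keep-n))] (mutate s))
               (dec g))))))
```

Given a predict-rate function, something like (evolve predict-rate 0.25 base-monster {:population 200 :generations 50 :keep-n 20}) would hand back a candidate for the designer to review.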

Again, having selected some changes, these are applied to the data files via the tool and then packed and shipped.

It was a really interesting talk about how engineers can make a real difference by building tools and was completely undersold by its title.

The Mixcloud talk on scaling on a bootstrap budget was very interesting as most talks on scaling are about reliability, volume and throughput. It is very rare to get one that focuses purely on trying to create the lowest cost stack.

One of the key things they do to achieve this is a lot of capacity planning with just-in-time rental, buying capacity just ahead of rising usage, something that is much easier when you have a focused product with a limited scope that all your engineers can focus on.

They were also using some interesting hacks, like ruthlessly exercising their right to renew contracts to make sure their applications ran on the newest hardware being brought into the datacentre instead of staying on the older blades. A few of the other things I'd heard of before: specifying your requirements so that you need whole individual boxes, and therefore don't share your infrastructure with anyone else, rather than building smaller services with numerous deployments.

There were a few blanket statements that I didn't agree with. For example, S3 was condemned as being "expensive" when it's really not; the more nuanced statement is that S3 bandwidth is expensive, and that it is really more of a storage solution than something you use to serve the public directly at scale.

One of the big domain-specific issues was around streaming audio files. Intriguingly, when you serve the files the connection is so fast that you end up sending the whole asset to the browser, when the user is perhaps only going to listen to ten seconds of it to see if they like it.

A lot of the talk was really about building a single point of presence CDN on the cheap. I did wonder whether there wasn't something smart to be done with servers that regulated the downloads more evenly, or with a custom player and streaming format.

I stopped by the Julia introduction and there were some interesting points, but it was very slow. Julia is quite an interesting language though, and I should spend more time with it.

The final talk of the day was on "smells" in automated testing. I thought this would be an interesting topic because I think automated testing is hard, but a combination of obscure slide illustrations, fairly old testing strategies and dodgy OO code examples at the end of the day resulted in a talk that felt side-tracked. Testing is hard, and since test code is code it does not seem worth calling out tests as something special within a codebase. Writing good test code means writing good code, and applying the same scrutiny of solution design to the test code just makes sense.

Two things that were not mentioned in the talk but which I think matter when you are talking about the subject as a whole are monitoring and generative testing. I think any talk about testing now needs to cover an approach to generative testing; the old world of testing examples and specifications might be helpful for illustrating code but should not really be considered proper test code.
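
To make the distinction concrete, here is a minimal generative test using clojure.test.check (my own example property, not one from the talk): instead of asserting on hand-picked examples, you state a property and let the library generate the cases.

```clojure
(require '[clojure.test.check :as tc]
         '[clojure.test.check.generators :as gen]
         '[clojure.test.check.properties :as prop])

;; A property rather than an example: sorting never loses or invents
;; elements, checked against randomly generated vectors of integers.
(def sort-preserves-elements
  (prop/for-all [v (gen/vector gen/int)]
    (= (frequencies v) (frequencies (sort v)))))

(tc/quick-check 100 sort-preserves-elements)
;; => {:result true, :num-tests 100, ...}
```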

Things that can be extremely difficult to test might be trivial to monitor. Time spent understanding the performance of code in production can be just as valuable as investing a lot of time in creating complex test code.

The whole day was full of interesting talks and bits and pieces and I'm definitely interested in trying to make the trip to the summer version of the event.

Standard
Games

Feral Vector 2014

I decided to take a break from my regular technology concerns and take a day off to visit the indie games conference Feral Vector. The conference program was packed through the whole day with 20-minute talks (making it a little tricky to judge when to leave for things like lunch). The programming was really good and the shorter format made it feel more lively than other recent events I've been at, like State of the Browser.

The venue was the Crypt on the Green, aka the crypt of St James's in Clerkenwell, which is a pretty great venue and particularly in the summer has the advantage of the church grounds/public park. In terms of layout, though, the talks were in the side room with the far larger room being given over to the game demos. So the talks were packed all day while the main room always felt empty; a swap on the day might have worked a lot better. The small mezzanine used as a tea room also ended up feeling like a sauna.

In terms of the demos, I liked the folk games tutorial of Turtle Wushu and I really liked Night in the Woods, which felt like a platforming Slackers in the same cultural space as Gone Home. Hohokum was very weird; it definitely has that play feel but lacks enough feedback to make you feel like you're actually interacting with the world.

I didn't see all the talks so I'm going to talk about the ones that I liked. Standouts were Tim Hunkin on the arcade game booths he builds; the units are witty takes on conventional games of physical strength and dexterity. Adam Hay gave a great overview of how music and audio design have developed in videogames and was the closest the event came to a technical talk. The explanation of the different synth chips versus sampled sound was interesting, along with the way that sound design was initially a technical challenge due to hardware but then becomes a simulation challenge once hardware ceases to be a limit.

There were a few performance pieces (and quite a few journalists or writers among the speakers): Christos Reid oscillated between the confessional and an analysis of autobiographical games, making lots of good points but never really being clear about what any of it meant. Alice O'Conner mixed a spoken-word performance of mod readme files with her confession that she was losing interest in gaming and an awkward attempt to contextualise the readme file writers; the recital element was the strongest. Hannah Nicklin gave the strongest performance, on the subject of how games break for her, but the strength of the performance robbed the analysis of power: you ended up appreciating the delivery while feeling that the material lacked the depth and reflection it deserved. You felt that it was more about hitting a beat than exploring an idea. Near the end of the day James Parker did a puppet show Q&A that took a few easy pot shots but was also laugh-out-loud funny; his turn actually did the best job of marrying form and material.

Tammy Nicholls talked about world building and how it is valuable both to game depth and play and to the commercial aspects of intellectual property, but never really got into the details, so I felt I was hearing half an argument. I often find that a good narrative and deep background carry you through some poor gameplay, but perhaps that is undervalued in terms of game development.

Luke Whittaker talked about working on the game Lumino City and why, for this game and the previous game Lume, the studio has focussed on physical materials translated into a game format. While the details of laser-cutting cardboard to make the city were fascinating, I'm not sure whether any meaningful justification for the approach was offered except the aesthetic, which seems to be partly a nostalgia for a certain era of animation. However, there's no denying that the aesthetic is unique and it's worth looking through the screenshots on the site.

There was also an interesting piece on combining art styles by SFB Games and collaborator Catherine Unger which again had a little bit of technical detail as to the issues and the solutions.

Finally there was a talk about physical puzzle rooms, a genre I don’t like even in digital format, but it did mention the interesting intersection between immersive, participative theatre and physical gaming. This was relatively new ground for me (although obviously people have raved about Secret Cinema). I was interested by the idea of things like 2.8 days later and the Heist. Not enough to want to participate yet but definitely more curious about the possibilities.

I think the talks were all recorded, although the room was often in darkness to make the projection work so I’m not sure how that worked out.

Standard
Programming

Scale Summit 2014

Scale Summit is the new Scale Camp, an unconference aimed at covering the same kinds of topics as you might expect at Velocity.

This was the first Scale Summit; the venue was excellent, as was the food (especially the bacon rolls, from Eden apparently) and the supply of drink. Scale Summit happens under the Chatham House Rule, so there's no attribution of what is said, which allows the attendees to be really frank and lets people be free with what they really know rather than hedging and trying to be "on message". It makes for a fascinating gathering.

The sessions varied in their organisation but all focussed on discussion between the participants. I managed to go to the Elasticsearch session, which was interesting for the practical boundaries that people were finding and also for the operational knowledge. On the subject of using ES as the primary application store, the feeling seemed to be "not yet", but there were also some words of wisdom about separating out document stores and search functionality and not finding a superficial unity in the two purposes.

The microservices session was a fast and furious fishbowl, easily the liveliest event and one that is going to require a post in its own right. It was interesting to see that the room split into practitioners and people who were sceptical that microservices were a thing or held value over conventional service development.

After lunch I sat in on what can be done to get frontend testing off the critical path to production (not much now but clearly more effort needs to be made), distributed DOS attacks on transactional sites (not as scary as I imagined but again we have to be thinking about how this works), distributed data stores (good war stories, felt better informed for going), getting ops and developers to work together and Linux containers (definitely going to try Docker now).

I had quite a few questions going into the event and while I didn’t get all the answers I hoped for I did at least establish that smart people don’t have simple answers to them either which is reassuring. It’s hard to tell in the heat of it all whether you’re on the edge of things doing things that are pushing the boundaries or simply over-complicating your situation.

The attendees were nicely mixed and from a range of backgrounds; ops, architecture and development were all well-represented, so you felt you were seeing a rounded situation.

The unconference format left me wanting more rather than feeling I had had enough. The openness was amazing and I am planning on being there next year.

Standard