I was craving a coffee at around 9 o'clock in the morning when I walked into the Ópera metro station, from where I took the train to the conference venue.

Luckily, the Spain.js crew had already prepared everything, and on arriving at the university campus – voilà – the coffee was ready to be served, which quickly got me into the mood for another eight hours of information-loaded JavaScript talks.

Looking at the presentation topics, the second day of Spain.js promised to sustain the momentum of the first, with nine extremely diverse speakers on stage giving talks about everything from 3D and sound APIs to internal properties and parsers.

Read on...

"JavaScript for makers"

If you're into Arduino, Raspberry Pi and similar gadgets that makers all over the world are currently using to build crazy stuff, you could probably have an evening-long conversation with Peter Christensen, who gave us an introduction to DIY hardware and the role of JavaScript in this domain.

He sees more and more parallels between the worlds of hardware and software: hardware – just as software has long been – is nowadays a lot easier to access, distribute and build on.

Here's an interesting analogy that he drew:

"Software enables intelligent interaction with abstract things while hardware enables intelligent interaction with physical things."

After a short electronics crash course, he suggested we visit one of the maker faires taking place all over the world, the closest one being held in Bilbao next weekend. These are expositions by people who celebrate the art of creative engineering, building things such as self-reproducing robots or node.js-driven submarines.

I found this talk very inspiring, and – like many others – I would really like to dig deeper into the world of programmable mini computers if I had the time.

Slides of this talk can be found on Slideshare.

"Categorizing values"

Ok, so far the inspiring stuff. Now it was time for an in-depth, code-based JavaScript talk. Axel Rauschmayer – a real guru in the world of JavaScript who constantly puts up interesting material on his personal blog, 2ality.com – introduced some internals that most of us rely on every day without really being aware of them while building awesome things. Examples?

  • Objects are only equal to themselves, which means that {} === {} is false, but obj === obj is true.
  • There are two sides to prototypes: every object has a hidden internal property [[Prototype]], and constructor functions carry a prototype object that becomes the [[Prototype]] of their instances.
  • Understanding the difference between instanceof and typeof.
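The first and last bullet points are quickly verified in any JavaScript console; here is a small, self-contained sketch:

```javascript
// Objects are only equal to themselves (identity, not structure)
var a = {};
var b = {};
console.log(a === b); // false: two distinct objects
console.log(a === a); // true: same reference

// typeof reports a value's primitive category...
console.log(typeof 42);   // "number"
console.log(typeof "hi"); // "string"
console.log(typeof {});   // "object"

// ...while instanceof walks the prototype chain
function Point(x, y) { this.x = x; this.y = y; }
var p = new Point(1, 2);
console.log(p instanceof Point); // true
console.log(typeof p);           // "object" - typeof can't see Point
```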

Fast-forward to the core of his presentation. The suggestion is to use these four methods of categorization in JavaScript:

  • via the internal property [[Class]]
  • via the typeof operator
  • via the instanceof operator
  • via the function Array.isArray()
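[[Class]] is internal and not directly accessible, but it can be read through Object.prototype.toString, a trick Axel covers on 2ality. A quick sketch of all four techniques side by side:

```javascript
// Read the internal [[Class]] property of any value
function classOf(value) {
  // "[object Array]" -> "Array"
  return Object.prototype.toString.call(value).slice(8, -1);
}

console.log(classOf([]));   // "Array"
console.log(classOf(null)); // "Null"
console.log(classOf(/x/));  // "RegExp"

console.log(typeof "abc");        // "string" (primitives)
console.log([] instanceof Array); // true     (prototype chain)
console.log(Array.isArray([]));   // true     (works across frames too)
```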

As the topic is quite complex, I suggest you prepare a big cup of coffee and read the following blog post on Adobe Developer Connection to get a better understanding.

"A particle system with the DOM"

What followed was probably the most entertaining talk so far at this conference.

Teddy Kishi demonstrated what is possible today with CSS3 and JavaScript in modern browsers, showing us some of the awesome particle systems he has set up. These ranged from simple experiments like a typical screensaver-style space simulation to extremely cool animations of waterfalls, fire, snow and CATS.

All of this of course wouldn't have felt complete without covering the ever-popular topic of performance. So he demonstrated how smoothly the particle system behaves when it is based on:

  • top/left position OR
  • 2d translation OR
  • 3d translation

I leave it up to you to guess which one is performing best and point you to the slides of the presentation where you can check out the examples yourself: http://teddyk.be/lab/domparticles/
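For reference, the three positioning techniques he compared look roughly like this (a hedged sketch; `el` stands for any DOM element, and vendor prefixes are omitted):

```javascript
// Three ways to move a particle element to (x, y), in increasing
// order of how well browsers can hardware-accelerate them

function moveTopLeft(el, x, y) {
  // forces layout work on every frame
  el.style.left = x + 'px';
  el.style.top = y + 'px';
}

function moveTranslate2d(el, x, y) {
  el.style.transform = 'translate(' + x + 'px, ' + y + 'px)';
}

function moveTranslate3d(el, x, y) {
  // the 3d variant typically promotes the element to its own GPU layer
  el.style.transform = 'translate3d(' + x + 'px, ' + y + 'px, 0)';
}
```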

"Behind the scenes of the SpiderMonkey Parser API"

After that, Michael Ficarra took us on a journey through the inner workings of what was the very first JavaScript engine.

Besides looking at the engine's parsing and tokenization process, he provided a massive compilation of third-party tools for testing, maintaining, monitoring and improving our JavaScript code. Some of these tools are listed below:

  • reflect.js - a JavaScript (ES3-compatible) implementation of Mozilla's Parser API
  • Esprima - an ECMAScript parsing infrastructure for multipurpose analysis
  • escodegen - an ECMAScript code generator for the Parser API
  • esmangle - a mangler/minifier for the Parser API
  • Mozilla source-map - a library to generate and consume the source map format
  • brushtail - tail call optimisation for JavaScript
  • complexity-report - a tool for reporting code complexity metrics in JavaScript projects
  • istanbul - a JavaScript coverage tool written in JS

And last but not least, sweet.js, a JavaScript add-on that brings macros to the language.
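To give a flavour of what all these tools consume and produce: the Parser API represents programs as plain, JSON-like trees. An expression like `1 + 2`, for instance, comes out roughly as the node below (hand-written to match the spec's field names, not actual parser output), and walking such trees is how generators and minifiers work:

```javascript
// The Parser API AST for the expression `1 + 2`, built by hand
var ast = {
  type: 'BinaryExpression',
  operator: '+',
  left:  { type: 'Literal', value: 1 },
  right: { type: 'Literal', value: 2 }
};

// Tools like escodegen recursively walk such trees;
// a toy evaluator takes only a few lines:
function evaluate(node) {
  switch (node.type) {
    case 'Literal':
      return node.value;
    case 'BinaryExpression':
      if (node.operator === '+') {
        return evaluate(node.left) + evaluate(node.right);
      }
  }
  throw new Error('Unsupported node: ' + node.type);
}

console.log(evaluate(ast)); // 3
```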

You can check out his presentation on SpeakerDeck.

"Manage Those dependencies"

Jakob Mattson then took on the difficult task of waking us up after the lunch break, and yes, he gave a very decent talk. It was also one of the best in my opinion regarding structure, the amount of information provided, and the quality and simplicity of the slides.

He introduced several tools that allow us to manage dependencies in complex JavaScript applications. To keep it short, here is a summary of just the cons of each tool:

  • Ender: Don't use it, because it puts all imported libraries into one single namespace, namely the $ symbol, which is extremely inconvenient when you use jQuery
  • Browserify: Despite being good at solving the problem of nested dependencies, it bundles everything into one file.
  • Require.js: My personal favorite, but Jakob pointed out a disadvantage: It makes it a bit harder to optimize your application
  • Jam: This tool claims to put the browser first but still doesn't have a huge index and adds some packaging overhead
  • Bower: The solution provided by Twitter is actually just an index for packages and thus won't take you all the way.
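To illustrate the AMD style that Require.js popularized, here is a toy module registry of my own (a purely didactic sketch: the real Require.js also loads scripts asynchronously, caches modules, and handles much more):

```javascript
// A toy AMD-style registry: define() records a factory with its
// dependency names, requireModule() resolves and runs them
var registry = {};

function define(name, deps, factory) {
  registry[name] = { deps: deps, factory: factory };
}

function requireModule(name) {
  var mod = registry[name];
  // resolve dependencies recursively, then invoke the factory with them
  var resolved = mod.deps.map(requireModule);
  return mod.factory.apply(null, resolved);
}

define('greeter', [], function () {
  return { greet: function (who) { return 'Hello, ' + who; } };
});

define('app', ['greeter'], function (greeter) {
  return greeter.greet('Spain.js');
});

console.log(requireModule('app')); // "Hello, Spain.js"
```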

His presentation can be found on SpeakerDeck as well.

"Serving billions of requests with node.js"

Tom Buchok didn't use a microphone and was one of the few speakers who actually used the stage space to its full potential. You could feel the drive and passion with which he was talking about node.js, Redis and the extent to which he has used these tools to create highly performant web applications.

In order to properly serve dynamic content – as opposed to static content, which is better served off CDNs – he suggests using streams to decrease latency. Since node.js has built-in support for transferring data in chunks, this approach allows you to run many concurrent downloads on a single web server.

Something web developers often fail to keep in mind is that error handling matters. Tom pointed out a few scenarios where it can be crucial, such as when sockets time out, so passing errors around and dealing with them properly is important. Equally important is visibility, in other words monitoring, performance measurement and testing. To give an example, many people don't care about exact version matching when adding dependencies to their package.json files, which is a potential source of bugs that can affect deployments.
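On that last point: pinning exact versions instead of loose ranges makes deployments reproducible. A package.json fragment along these lines (package names and version numbers are illustrative only):

```json
{
  "dependencies": {
    "express": "3.2.6",
    "redis": "0.8.4"
  }
}
```

With a range like "3.x" instead, two deployments made weeks apart can silently pull different code.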

So far I haven't found the slides of this talk, but you might want to track the event on Lanyrd to be notified once they are available.

"Threedee tales of Urban Bohemian"

The next talk took me back to the days when I was studying for a computer graphics exam at university, and as Vicent Martí stated in one of the first slides: 3D is hard. And I thought: yes, I remember.

We then went through the process of building an actual video-game render tree, at a pace that made taking notes a big challenge. The WebGL API, available in some modern web browsers, renders interactive 3D graphics by accessing the computer's GPU. It is, however, NOT a high-level API for 3D graphics, nor is it an OpenGL wrapper.

Setting up is as easy as creating a canvas element and calling canvas.getContext('webgl');
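In code, that looks roughly like the sketch below (hedged: in many browsers of this era the context is still only available under the prefixed name 'experimental-webgl', hence the fallback):

```javascript
// Obtain a WebGL rendering context from a canvas element,
// falling back to the prefixed name used by older browsers
function getWebGLContext(canvas) {
  var gl = canvas.getContext('webgl') ||
           canvas.getContext('experimental-webgl');
  if (!gl) {
    throw new Error('WebGL is not supported in this browser');
  }
  return gl;
}

// In a browser:
//   var canvas = document.createElement('canvas');
//   var gl = getWebGLContext(canvas);
//   gl.clearColor(0, 0, 0, 1);
//   gl.clear(gl.COLOR_BUFFER_BIT);
```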

According to Paul Irish's research, you should use the requestAnimationFrame callback when animating things in the browser, because it reduces CPU, GPU and memory usage.
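The canonical animation-loop shape is a function that redraws and then re-schedules itself (a sketch; the setTimeout fallback mirrors the polyfill idea from Paul Irish's article):

```javascript
// requestAnimationFrame lets the browser schedule the next frame;
// outside a browser we fall back to roughly 60fps timeouts
var raf = (typeof window !== 'undefined' && window.requestAnimationFrame)
  ? window.requestAnimationFrame.bind(window)
  : function (cb) { return setTimeout(function () { cb(Date.now()); }, 1000 / 60); };

var frame = 0;

function tick() {
  frame += 1;      // update and draw the scene here
  if (frame < 5) { // stop after a few frames for this demo
    raf(tick);
  }
}

tick();
```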

After a short detour into the world of coordinates, matrices and camera perspectives, we got to the topic of shaders. The most current and decent browser implementation here is the OpenGL Shading Language (abbreviated GLSL), which – according to Wikipedia – gives developers more direct control of the graphics pipeline without having to use hardware-specific languages. To deal with this kind of technology at the JavaScript level, Vicent recommends the three.js library, since it abstracts away the headaches of using 3D in the browser.

Still, he summarized, 3D remains a hard topic to understand and learn. I agree.

"The amazing sounds of JS"

What we saw afterwards was a brilliant demonstration of the Web Audio API, which is – according to Stuart Memo – the greatest invention of the last 100 years.

The guy from Glasgow went on stage equipped with a guitar and a keyboard. And if Teddy Kishi gave the most visually entertaining talk, this was its acoustic counterpart. After firing up some sine- and square-wave oscillators from the console to reproduce basic synthesizer sounds, he showed us some amazing tools he has built on top of the API that I would like to share with you.
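If you want to try the oscillator trick yourself, a basic beep takes only a few lines (a hedged sketch against the Web Audio API; note that current Chrome and Safari still expose the constructor as the prefixed webkitAudioContext):

```javascript
// Play a short square-wave beep through the Web Audio API
function beep(ctx, frequency, seconds) {
  var osc = ctx.createOscillator();
  osc.type = 'square';             // or 'sine', 'sawtooth', 'triangle'
  osc.frequency.value = frequency; // pitch in Hz, e.g. 440 for A4
  osc.connect(ctx.destination);    // wire it to the speakers
  osc.start(0);
  osc.stop(ctx.currentTime + seconds);
  return osc;
}

// In a browser:
//   var Ctx = window.AudioContext || window.webkitAudioContext;
//   beep(new Ctx(), 440, 0.5); // half a second of A4
```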

After earning thunderous applause for performing some hits from music history on his funky inventions, he still proved cool enough to ask the critical question:
"Why?" – The presentation alone was reason enough, I would say!

"Hunting down memory leaks"

To round off the day, we got to hear another technical talk, this time from Jozsef Ferenc Pengo about JavaScript performance in today's web browsers. Let's start with one of the vital statements of his talk:

"More memory does not equal better performance."

There is no algorithm that can decide what you need and what you don't. Some of the most common mistakes we make as web developers that lead to memory leaks are the following:

  • Browser bugs (Internet Explorer)
  • Accidentally keeping references, such as when not removing DOM references, event handlers or timers
  • Framework misuse

He suggests using the Chrome Developer Tools, making extensive use of the "Memory Timeline", "Heap Snapshot" and "Record Heap Allocation" features to track down potential memory leaks.
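The "accidentally kept reference" class of leak is easy to reproduce; a deliberately simplified sketch of the timer case:

```javascript
// A classic leak: the interval's closure keeps `bigData` alive forever,
// even if nothing else in the app still needs it
function startPolling() {
  var bigData = new Array(100000).join('payload ');
  var id = setInterval(function () {
    // referencing bigData here pins it in memory for the timer's lifetime
    void bigData.length;
  }, 60000);
  return id; // hand the id back so the caller CAN clean up
}

var timerId = startPolling();

// The fix: clear the timer when it is no longer needed, which releases
// the closure and lets the garbage collector reclaim bigData
clearInterval(timerId);
```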

Here's the link to his presentation on Prezi.


To finish off the evening, the Atlassian guys invited us to Aleia Chillout in the vibrant district of Chueca. With free mojitos and Mexican food, it was the perfect place to unwind and review the conference with some like-minded fellows.

In my opinion, the conference was a big success. We saw a great variety of speakers, some well-known names and some promising newcomers. Personally, with all the notes I have taken, I still have a long way to go digging deeper into all the techniques and technologies I learned about. And it is inspiring every time to visit a conference with so many people who share the same interests.

Regarding organization, catering, timing, wi-fi availability and speaker quality, this has been a truly outstanding event as well. I hope that in 2014 we will see another Spain.js, and with this in mind, kudos and congratulations to the organizers for all their efforts in putting on this great event. Cheers!