He then described how jQuery became the dominant framework but grew into a monolith, soon too big for many sites and purposes.
This gave rise to micro frameworks: each one does a single job and stays small.
It is the same philosophy the Unix world applies to its tools.
The Q&A starts early, so there is plenty of question time and the conversation goes into free flow.
Don’t change libraries on every project; settle on ‘go to’ libraries you reuse across many different projects.
Tools like ‘ender’ try to unify things anyway: build against the ender interface and you get code reuse.
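The ‘build against one interface’ idea can be sketched in a few lines of plain JavaScript. This is a toy illustration only: the `$.use` registration function here is hypothetical and is not ender’s actual API.

```javascript
// Toy facade: small single-purpose modules register the one job they do
// behind a shared object, so calling code never depends on any one library.
// (Hypothetical API for illustration, not how ender actually works.)
var $ = (function () {
  var api = {};
  api.use = function (name, fn) {
    api[name] = fn;
    return api; // chainable, so modules can be stacked
  };
  return api;
})();

// a string-utility module and a type-check module plug into the same facade
$.use('trim', function (s) {
  return s.replace(/^\s+|\s+$/g, '');
});
$.use('isArray', function (x) {
  return Object.prototype.toString.call(x) === '[object Array]';
});

$.trim('  hi  ');   // → 'hi'
$.isArray([1, 2]);  // → true
```

Because callers only ever touch `$`, any one module can be swapped out without changing the code built on top of it.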
What micro frameworks would you recommend?
There are a few good ones by the authors of ‘ender’: a functional library called ‘valentine’, plus ‘bonzo’ and ‘bean’, which are not bad at all for DOM manipulation and events.
‘qwery’ and ‘reqwest’ (both deliberately odd spellings) are good for selector and Ajax work. But it all depends on your use cases. Check out microjs.com for an index of micro frameworks and use that as a first step.
Doesn’t gzipping reduce jQuery to about 18 KB anyway?
Yes, but the micro framework approach brings things down to around 7 KB.
But jQuery makes sure things are cross-browser compliant, so why should I switch?
These micro frameworks will keep improving over time as more people contribute.
Doesn’t ‘ender’ do something special to ‘give things back’ to other frameworks? Surely that is extra work?
Ender is a Node.js application: it runs on the command line and piggybacks on npm packaging. When you build something with ‘ender’, it downloads the requested frameworks from the npm registry, saves them to the local machine, combines them through the ender adapter, then runs the result through UglifyJS to give you a compressed version.
You can then add additional frameworks to your build and upload a single file to your live site.
Considering the difference is only 10–15 KB, does this really make a difference to client-side performance?
Yes and no. Theoretically the micro framework will load faster, but at these file sizes the difference is less noticeable than other factors. You will mostly see the benefit elsewhere.
But doesn’t that defeat your point?!
That was about the parsing of the file, though.
The physical size of the files is not that different; I would say the interpreter is much more of a factor here?
Modern JS interpreters don’t really care about file size; the benefits come from reduced HTTP overhead.
Cache sizes on some mobile devices also play a part, but the number of HTTP requests is the main issue here.
How does this differ from taking jQuery and other JS libraries and splicing them into a single JS file to deploy live?
It doesn’t make much difference, but with micro frameworks the file sizes are smaller, so less data is served on each request, saving server bandwidth.
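The single-file deploy being discussed amounts to concatenating module sources into one payload so the browser makes one HTTP request instead of many. A minimal sketch of that step (illustrative only; a real build would use ender or a similar tool plus a minifier):

```javascript
// Toy bundler: wrap each module source in an IIFE to keep its scope
// private, then join everything into one deployable string.
// (Illustration of the concept, not a real build tool.)
function bundle(modules) {
  return modules
    .map(function (m) {
      return ';(function () {\n' + m.source + '\n})();';
    })
    .join('\n');
}

var output = bundle([
  { name: 'events', source: 'var on = function (el, ev, fn) {};' },
  { name: 'ajax',   source: 'var get = function (url, cb) {};' }
]);
// output is a single string the server can serve as one file
```

The leading `;` before each IIFE is a common defensive habit so that a module missing its final semicolon cannot break the one concatenated after it.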
After trying many different options, Justin decided to use d3.js for his graphing purposes.
The d3 mailing list is really nice and friendly, so you will have an easy time when starting out with it.
Use the presentations from HTML5 Rocks; they are really nice.
d3.js is about code and not configuration, so it’s nice to work with.
d3.js is itself composed of small, micro-framework-style pieces that you combine to build graphs.
Justin shows a practical code example.
There is a CSV parser built into d3, which makes working with CSV data easy.
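d3’s parser turns header-labelled CSV text into an array of objects keyed by column name. A minimal plain-JavaScript sketch of that same idea (no quoting or escaping support, and not d3’s actual implementation):

```javascript
// Minimal header-keyed CSV parse: first line supplies the keys,
// each following line becomes one object.
// (Toy sketch only; d3's real parser handles quoting, escaping, etc.)
function parseCsv(text) {
  var lines = text.trim().split('\n');
  var headers = lines[0].split(',');
  return lines.slice(1).map(function (line) {
    var cells = line.split(',');
    var row = {};
    headers.forEach(function (h, i) {
      row[h] = cells[i];
    });
    return row;
  });
}

parseCsv('name,value\na,1\nb,2');
// → [{ name: 'a', value: '1' }, { name: 'b', value: '2' }]
```

Once rows arrive in this shape, each object maps naturally onto one data point in a chart, which is what makes the CSV-to-graph path in d3 so short.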
Justin shows a really nice example of a dynamically updating graph with transitions, that looks really nice and is rendered in SVG.
He follows with a whole series of live, dynamic diagrams that really have to be seen; they cannot easily be described in words.
The Rickshaw library is built on top of d3, itself constructed in a micro-framework fashion, and is really handy: it gives you an easy way of creating common, standard graphs.
All of this is very micro-frameworky, which is nice.