Jacques Mattheij

Technology, Coding and Business

Highschool Reunion, Bullies And Being A Nerdy Kid In The '70s And Early '80s

A while ago I received an invitation to a high-school reunion. I didn’t go and I probably will never go to a thing like that. I’m trying very hard to forget that period.

I love to learn, but it wasn’t always so.

As a small kid, up to the age of 12, I definitely did. During the summer holidays that I would spend with my paternal grandmother in Arnhem, I'd eat my way through the public library technology section, and when I was done with that I made a deal with a second-hand bookstore called 'de Slegte': if I could read a book I'd bought in the morning before closing time, I could trade it in for free for another book, and if I could read that one before the next morning I could do it again. So for the price of one book they allowed me to read their whole catalog, unless I wanted to keep a book, in which case I'd have to pay again (I kept plenty of them). I read a lot of books that way, and inevitably some of the stuff I read stuck and interlocked with other bits and pieces that I already knew. Communications satellites, computers, rockets, evolution, whatever was available, I'd lap it up, ever hungry for more.

Grade school was misery, lots of bullying, lots of fighting, but there was the promise of getting into the VWO, the school where you could prepare yourself for a career in science, provided you had good grades and provided you worked hard at it. One of my uncles, who had worked for Philips Electronics in Eindhoven, taught me a bit of calculus and I loved it; this went way beyond grade school arithmetic.

Finally, in 1977 I managed to leave grade school behind me and went to high school, a school with a Christian background (even though I did not believe in any god, my mom thought it wouldn't hurt to send me to a school with kids from an environment like that). The introduction week was pretty harsh; I ended up cycling home on my own because of all the fights. Incredible, this was supposed to be the thing I'd been looking forward to for so long. Prototypical nerd, I got into trouble from day one because I stood out in every way that could make you unpopular: a small, scrawny kid, too clever for his age, not yet interested in girls or any of the other things all the other boys were going on about.

When I was 12, color televisions existed but not every household had one, the VCR was about to break through, and computers were for the most part locked up in large buildings with an army of priests and acolytes tending to their needs. In many houses the most prominent item in the living room was a gigantic column of hi-fi components hooked up to a pair of enormous speakers.

Even though computers were my long-time ambition, between the ripe old ages of 8 and 15 or so I did the next best thing: tinker with electronics. The cold war was in full swing and there was quite a bit of propaganda intended to strike fear into the population with respect to our imminent annihilation. This affected me to the point that whenever I was at work in the basement of the house and a plane flew over low, I would sneak a peek to make sure that it wasn't a long range bomber.

Being rather chronically short on pocket money (tools are expensive!), my method of acquiring parts was a bit unconventional: I would get up very early on Tuesdays and Fridays, race with my bike ahead of the garbage collection trucks, strip any and all electronic gear that I spotted near the kerbside of its circuit boards and stash the boards in my newspaper carrier bags. If something was unknown, interesting or repairable, I'd take it home in one piece. Then I'd go home, dump the loot and head off to school (I usually managed to get to school before the bell, but not always, and I'm pretty sure that Tuesdays and Fridays stood out on my absentee reports for no reason apparent to the school officials).

The boards were stripped using a blowtorch and a water bucket (heat the board in one go, slam it down and all the parts drop straight into the water for instant cooling); it was pretty much a reverse assembly line. Stacks of sorted parts lined the walls, and a pretty crummy assortment of tools completed my arsenal. I had the most terrible soldering iron in the world, one of those transformer-based soldering guns that leaked current like there was no tomorrow. Very bad for soldering anything statically sensitive. They also tended to melt down with prolonged use; they were actually intended for very short bursts of activity.

On one evening every week there was a gathering at a local community house where more advanced electronics hobbyists would teach us newbies the magic of making your own circuit boards and doing some more advanced designs. For the first couple of years my career in electronics mostly consisted of ripping stuff apart, and I was as surprised as anybody when I finally managed to actually repair something (iirc this was a radio cassette player owned by a girl in my high-school class; it had blown up its transformer but was otherwise in good health, judging by the fact that it would still work on batteries).

Meanwhile, in school things got a lot worse. I got the highest score on the test that separates grade school from high school, and I wasn't about to start slacking. So I worked as hard as I could on the stuff they threw at us in that introductory year. The only subject I had trouble with was French; the rest I pretty much aced. Even though I did work hard, it didn't look like I was working all that hard. Not long after, the nicknames started: teacher's pet, professor and so on. There was one other guy in that class who scored about as well as I did on most subjects, a kid called Cornelis. He had a buddy who at 12 looked like he was 16 or so and nobody dared to pick on him (good for him!), but unfortunately, because I came from another part of town, I had no friends at all at the beginning of the new school year, which made me a convenient target.

Tons of fights, mostly with a small group of kids that enjoyed disturbing the lessons and that were definitely not in this school to prepare themselves for anything in life; they were only there because their parents thought they should go. Everything about me was wrong to them, from my taste in music to my clothing, to the fact that I didn't care about team sports (though I did love cycling and ping-pong), to the fact that I actually enjoyed learning, and so on.

I got through the first year with top marks but felt totally miserable. During the summer holidays I figured out (wrongly) that if I was going to survive this school I'd have to somehow adopt some kind of camouflage. The easiest way to do this was to get bad grades. I stopped studying completely; all I took in were those things I heard during the lessons in class, and my books remained pretty much unopened that year. I flunked the year. French and now German were my weak points, and I think I got a '1' for German (the lowest mark possible; in the Netherlands marks ran from 1 to 10 at that time), to the great anger of the teacher, who saw that I was purposefully not doing anything to learn. (My German is passable these days, so it all ended up well in the end.)

Flunking the year was bad, but it had one silver lining: a reboot. New people, a bit younger, so I wasn't standing out as being all that scrawny anymore. The darker part of that cloud was that this particular class had a bunch of bullies that were much worse than the ones in the class I left behind. More fighting, suspensions and other trouble because of that. But at least I finished the year, because I already knew most of the material and even cracked a book every now and then. I actually liked the new classes (physics, chemistry) that year brought, and after doing the basics in terms of homework I got pretty good grades for those.

On to the third year. This one went pretty well for the most part; I had found the minimum amount of work I had to do to get the minimum grades I needed to 'pass'. Teachers still hated me because it was pretty obvious that I could do better. I spent a lot of time in school designing electronic circuits, got a paper route and eventually bought my first real computer with the proceeds (a TRS-80 pocket computer made by Sharp, which looked like an oversized calculator). So plenty of distractions, but not enough to make me fail another year.

The fourth year the bullying started again. This time it kept on escalating and it would not go away. One particular guy (who was always going on about his karate expertise) thought that it was fun to kick me under the table during lessons, out of sight of the teachers, egging me on to do something about it. When I finally snapped after a couple of months of this and put him in hospital, I was suspended for three months. So I failed that year as well. Right about this time my family situation deteriorated to the point where I had to leave the house I was living in and move in with my dad.

He and his new wife did what they could to accommodate me, but there was enough friction in the household that it didn't last for more than a year (which I spent at a trade school for electronics called the ETS in Amsterdam; I recently met up again with one of the guys from there and it was quite an amazing meeting). After that my mom split up with the guy she'd been with, I rejoined her and went to work for a big Dutch bank, and my formal education stopped. I still did a year of night school to try to get to equivalency, but the work I did was so physically exhausting that I could barely walk home, let alone do a night of clear-headed studying.

From absolutely acing the tests to being a person without any formal qualifications in one smooth move, from loving to learn to hating to learn (fortunately I branched out into loving to learn about electronics and computers instead of the stuff I was supposed to be studying in school, which laid the groundwork for the rest of my career).

Highschool can be hell, especially when you’re looking forward to it.

To all the bullies that I encountered during that time (you know who you are): thank you for hardening me in a way that serves me well to this day. I don't blame you for me not finishing my education, but it certainly did not help either. To all current bullies in high school: you are damaging a lot more than you think you are; it's not just fun and games. One of the people that found great pleasure in hounding me and a couple of other 'nerds' is, according to an ex-girlfriend I met at a wedding, still visiting a psychiatrist because of what he did back then. I've long ago forgiven him, but apparently he still hasn't forgiven himself. Funny how life can turn around.

Highschool reunions are not for me, too many horrors and too good a memory are a bad recipe for rehashing old times.

Defending The Right To Be Forgotten

One of the hotter topics of discussion at the moment is ‘The Right To Be Forgotten’. The basic idea is that individuals should have control over their online image.

Because ‘Google’ is now the front door to the web, for once the European Union got something right and concentrated not so much on the websites that may post items about a person, but on the search service providers that cause the information to be found. After all, if Google is the front door of the web it no longer really matters what's on the web: if you can't find it via Google with a 10 second investment in time, it might as well not exist.

Predictably, Google disagreed and a lawsuit ensued culminating in an all out confrontation on the subject with the French CNIL.

On the one hand I find myself agreeing with Google: if the right to be forgotten is implemented the way the EU wants it to be, the predictable result is that the majority of requests will be used as a whitewash by regimes, politicians and other people of power to shine up their image.

On the other hand, from the point of view of an individual who has done something wrong at some point in his or her life and who has, through either a fine or a jail sentence, paid their debt to society, it is kind of annoying to have been declared pretty much unemployable by Google or one of its lesser competitors.

So there has to be some kind of balance struck and it is extremely hard to codify such a balance in law.

I patently disagree with the ‘free speech’ arguments that are immediately bandied about. The way free speech is enshrined in the American Constitution is all fine and great, but it is an American concept. If an American company doing business in Europe is within its rights to say that it doesn't want French legal concepts applied the world over, then that should be mirrored by not forcing American-style free speech on the rest of the world either. What's good for the goose… If the subject is a resident of France then the French courts are clearly within their sphere of influence when they make statements about that; if they weren't, then by default we'd all be living under American law right now, and that should be something limited to the citizens of the United States. Such a French ruling should extend to all arms of American corporations, especially if they are using their status as foreigners to play whack-a-mole with the local judiciary. (Google in particular claims that ‘the right to be forgotten’ should only apply to google.fr, but that's a very cheap trick in my opinion and one that is going to incense the court.)

Search engine results aren't ‘free speech’ to begin with in my opinion; they're re-hashings of someone else's speech (or, for the most part, writings), and what the courts are saying is that when it concerns their citizens they (and not some corporation) will determine what is and what is not legal when it comes to collating data, as it should be. Note that such a collection of links about a subject (if the subject is a natural person) constitutes a database in the sense of the EU DPD, and there are already provisions in there that require the creators of such databases to comply with removal, correction and reporting requirements. So the right to be forgotten is a very basic interpretation of the EU DPD and it did not actually need any new laws at all.

If ‘the right to be forgotten’ is going to be useful then it should concentrate on the following:

  • It should only work for people, never for corporations or for people acting on behalf of a corporation in some capacity. So Royal Dutch Shell should not be able to use the right to be forgotten to whitewash its image with respect to, for instance, its dealings in Nigeria.

  • It should only work for those people whose lives are otherwise ‘unremarkable’; in other words, they should not have claims to fame other than the particular entries they are asking to be ‘forgotten’. So if you are a famous pianist and you get a bad review, that's unfortunate, but it is simply a fact of life that not all reviews will be positive. Ditto for politicians and others who cherish the limelight but wish to remove certain items from their career history to make themselves look better. If you want to be in the public eye, then everything that happens to you is fair game.

  • It should work across all properties owned by a search engine provider, to avoid search engine providers ‘shopping for a favorable law’ and essentially doing an end-run around the rights afforded by governments to their citizens.

The right to be forgotten should be used as a scalpel rather than as a hammer, and I feel that with these relatively small modifications the spirit of the ‘right to be forgotten’ could be made to work and most of the criticism against it would be laid to rest.

Retention Is The Key

It seems the only thing that web millionaire wannabes care about is growth. It's true that growth matters, but it isn't always obvious how you can grow a business without sooner or later imploding it by running out of funds, or what to do when growth inevitably plateaus.

When a company has just been founded, growing is easy. After all, going from 1 user to 2 users is 100% growth, and you should be able to do that in a day or so, maybe even a couple of hours. But from 2 to 3 is only 50% more, from 3 to 4 only 33%, and so on. With every user you add there is less relative growth, so that's one kind of downward pressure on the growth rate. The larger you are, the harder it is to grow at the same percentage rate!

The next kind of downward pressure on growth is that when you’re still small you can keep a lot of attention focused on the few users that you have. They’ll feel fantastic and they will pass the word to their friends that you guys and girls are the real deal, you understand customer service and you actually care. And then you grow and you reach a level where you have to reduce that attention because there is only so much of you and you can’t afford the personnel to look after all these new users. And so the fountain of new users slowly dries up. Another kind of downward pressure on growth.

Slowing down growth even further is the fact that all customers go through a life-cycle: they initially become acquainted with the product, they purchase some, and eventually they go away again, either because they're dissatisfied for some reason, have found something better, lose interest or simply die. What goes up must come down, but when it's early days and all customers are still in the first part of their life-cycle it seems as though things can only go further up. As soon as the company matures and its age starts to approach the average length of the customer life-cycle this will change (sometimes very rapidly), which adds another brake on growth.

The interesting intersection of these three factors is that even though growth in an absolute sense gets harder as you get larger, growth in a relative sense is a lot easier if you simply keep your customers happy and retain them for as long as you can. Retention, it turns out, is a key element in how big a business will become and how fast it will reach profitability (if that's the initial goal; it helps if you can switch to ‘profitable’ mode at the drop of a hat in case the economy turns against you in some way), critical mass (if network effects are important) or market domination.

First off, let’s define what retention actually means: it means that if a customer is ready to buy a product like yours and they have bought a product from you in the past that you will be their first choice. In the context of a subscription style business retention is measured as the number of times that customers renew their subscription (for instance: every month).

Let's do a numbers example with poor, average and great retention on an otherwise unremarkable business with a fixed cost of acquisition, a fixed marketing budget (so the same number of new customers attracted each month) and a fixed gross profit margin of 20% (though in practice, especially for services, the profit margin will likely improve as the company gets larger due to economies of scale; in a company shipping physical products that effect will be present too, but less strongly).

The example is an online service or a store where people on average buy something worth $30 every month. For the sake of simplicity we’ll keep all the numbers fixed except for retention, but in ‘real life’ the situation would likely be more complex which will have some effect on the outcome.

If the service has poor retention then the figures will go something like this:

10,000 customers, 500 new customers found every month, 5% churn, so 500 old customers lost every month. Total turnover $300K every month, steady state starting today, so no growth. Gross monthly profit is at $60K, maybe not enough to meet payroll so effectively this business is losing money month-to-month and they’ll need to up their game if they want to stay in business in the longer term.

For mediocre retention it would look like this:

10,000 customers, 500 new customers found every month, 2.5% churn, so when 2.5% of the customer base equals 500 customers we'll reach steady state. That's at 20,000 customers, though steady state is approached only gradually (the remaining gap halves roughly every two years), so eventually total turnover will be at $600K every month. Gross monthly profit is at $120K, probably enough to meet payroll, but not enough surplus to plow back into additional growth. Investment required is minimal, just enough to get through the next couple of months.
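To see how gradually that steady state is actually approached, here is a quick month-by-month simulation (a sketch, using the mediocre-retention numbers from this scenario):

```python
# Month-by-month simulation: 10,000 starting customers, 500 new customers
# per month, 2.5% monthly churn. Each month the base gains 500 and loses
# 2.5% of itself, so it converges towards 500 / 0.025 = 20,000 customers.
customers = 10_000.0
new_per_month = 500
churn = 0.025

for month in range(1, 121):
    customers += new_per_month - churn * customers

# After 10 years the gap to the 20,000-customer steady state has mostly,
# but not entirely, closed.
print(f"customers after 10 years: {customers:,.0f}")
```

The recurrence has the closed form 20,000 − 10,000 × 0.975^t, which is where the "gap halves roughly every two years" figure comes from.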

And for good retention it looks like this:

10,000 customers, 500 new customers found every month, 1% churn, so when 1% of the customer base equals 500 customers we'll reach steady state. That's at 50,000 customers, so total turnover $1.5M every month. Gross monthly profit is at $300K, more than enough to meet payroll, and enough to increase the marketing budget month-by-month to offset the first of the downward pressure factors, the fact that it takes more new users every month to maintain the same percentage growth. Investment required is very small compared to the turnover already achieved; likely even a bank would be willing to take the risk.

The enormous impact of churn is clearly visible here: depending on the rest of the model the first business might not even be profitable, the second will probably stay afloat, and the third, while investing the same in marketing/advertising as the first two, will make a killing, and if they plow a little of their turnover back into more marketing they can possibly achieve exponential growth.
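The arithmetic behind the three scenarios is simple enough to sketch in a few lines of Python (the figures are the ones used above; steady state is just new customers per month divided by the churn rate):

```python
def steady_state(new_per_month, churn):
    """Customer count at which monthly churn exactly offsets new signups."""
    return new_per_month / churn

def monthly_gross_profit(customers, avg_spend=30.0, margin=0.20):
    """Gross monthly profit at a given customer count ($30/month, 20% margin)."""
    return customers * avg_spend * margin

# The three scenarios from the text: 500 new customers per month at
# 5%, 2.5% and 1% monthly churn.
for churn in (0.05, 0.025, 0.01):
    n = steady_state(500, churn)
    print(f"churn {churn:>5.1%}: steady state {n:>8,.0f} customers, "
          f"gross profit ${monthly_gross_profit(n):>9,.0f}/month")
```

This reproduces the $60K, $120K and $300K gross profit figures, and makes the leverage obvious: halving churn doubles the eventual size of the business for the same marketing spend.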

That is why retention is key. So if you want to hit one out of the park make sure you keep your customers.

We're Heading Straight For AOL 2.0

Before ‘HTTP’, whenever a new kind of application was invented (say ‘file sharing’, or ‘address book’ or ‘messaging’) someone would sit down with a bunch of others and discuss the problem at some length. Then they'd draft a document describing the problem they intended to solve and the protocol they came up with to address it. That document would then be sent out to various parties that might have an interest in using the protocol, who would supply their feedback, and as a result a more complete version of the original document would be released. And so on, until the resulting protocol was considered mature enough for implementation. This process, centered around documents called RFCs, is what got us IP (the Internet Protocol), TCP (the Transmission Control Protocol), HTTP (the world wide web), SMTP (email), DNS (the domain name system), FTP (the file transfer protocol) and many other extremely useful building blocks for the modern internet.

The implementation of these protocols and their integration into applications was then left to the rest of the world, the standards body had - beyond maybe a reference implementation - no interest in that stage of the proceedings and certainly no commercial interest. Protocols came, were adopted and eventually replaced, either by something better or they died due to a lack of adoption.

And then something strange and for the most part unexpected happened. Where before all of the protocols were layered on top of the Transmission Control Protocol (TCP, in computer programmer lingo) or in some cases UDP, a protocol was invented that was so successful that it in turn became a transport layer all by itself (not without associated problems). The reason for this is that instead of delivering software to the end users, which then implemented the protocol using executables for the various platforms, HTTP allowed delivery of both the visual part of the application (the user interface) and (eventually) the rest of the client portion of the application in one go. The existence of firewalls (which block off a lot of the ports otherwise accessible for peer-to-peer and client-server computing) further accelerated this, to the point where instead of drafting RFCs for publicly available and open protocols, companies now deliver one half of their application and some custom protocol over HTTP, never mind interoperability with other services or playing nice.

The end result of all that is that we’re rapidly moving from an internet where computers are ‘peers’ (equals) to one where there are consumers and ‘data owners’, silos of end user data that work as hard as they can to stop you from communicating with other, similar silos.

Imagine an internet where every protocol except the ones most closely related to ‘plumbing’ (TCP/IP/UDP/DNS) is no longer open but closed. That may sound far-fetched, but even though the number of RFCs is still growing, the last RFC with an article in the Wikipedia list of RFCs is the iCalendar specification (RFC 5545), and it dates from 2009. Since then there has been a lot of movement on the web application front, but none of it has resulted in an open protocol for more than one vendor (or open source project) to implement. One explanation is that we now have all the protocols that we need; another is that more and more protocols are layered on top of HTTP in a much more proprietary manner.

This is a dangerous development, the end-game of which is an internet that is about as closed as it could get by removing all the interoperability and replacing it with custom and incompatible protocols over HTTP, maybe with the occasional server talking to another server in the background.

Email will probably be the last to go, when the last user of it finally gives up and moves to gmail so they can continue to communicate with their contacts, or maybe gives up entirely. RSS (an open content syndication protocol on top of HTTP, which I think is a nice way to illustrate that it is possible to use HTTP as a layer and play nice at the same time) is already an endangered species; XMPP support is slowly but surely being removed (just imagine a phone system where every number you call may require a different telephone); NNTP has been ‘mostly dead’ for years (though it still has some use, the real replacement of usenet for discussion purposes appears to be Reddit and mailing lists); and so on. The only protocols that are developed nowadays that are open are typically related to plumbing (moving bits of data around), not application level protocols, which determine how a whole class of applications around a similar theme can talk to each other.

The biggest internet players count users as their users, not as users in general. Interoperability is a detriment to such plays for dominance. So there are clear financial incentives to move away from a more open and decentralized internet to one that is much more centralized. Facebook would like its users to see Facebook as ‘the internet’, Google wouldn't mind it if their users did the same thing, and so on. They're their users, after all. But users are not to be owned by any one company, and the whole power of the internet and the world wide web is that it's peer to peer: in principle all computers connected to it are each other's equals, servers one moment, clients the next.

If the current trend persists we're heading straight for AOL 2.0, only now with a slick user interface, a couple more features and more users. I personally had higher hopes for the world wide web when it launched. Wouldn't it be ironic if the end-run the WWW did around AOL, precisely because the WWW was open and inclusive, ended up with different players simply re-implementing the AOL we already had, and that we got rid of because it was not the full internet.

So, if you're going to design a webapp and you wish to help reverse this trend (assuming that is still possible): please open up your protocols, commit to keeping them open and publish a specification. And please never do what Twitter did (start open, then close up as soon as you gain traction).

The Fastest Blog In The World

Update: James Hague, aka Dadgum, was inspired by this to do some work on his blog and I'm happy to report that his blog is now the fastest blog in the world: 4K transferred and 1 request per page, load time < 75 ms. If you have a blog that's even faster than that, let me know.

– Read on for the original post –

I positively hate bloat in all its forms. Take this BBC News article: it's 2,300 bytes of content, but it loads 1.2 million bytes of data. That's more than a megabyte for what probably should not be more than several tens of kilobytes. (Edit: this used the Google homepage as an example before, which was a poor choice because the Google homepage does a lot under the hood that is not visible to the user, though personally I actually liked the really simple old page.)

Bloat to me exemplifies the wastefulness of our nature, consuming more than we should of the resources that are available to us. A typical blog post on most blogging platforms will (even if the blog post itself is just a few kilobytes of text) load an easy megabyte. The words themselves are usually less than a few kilobytes even for the largest posts. Imagine an envelope for a letter that weighed a couple of pounds for a 1 gram letter!

So, when the time came to finally attack the issue of slow re-generation of these pages when I was using ‘octopress’ I decided to not only upgrade the blogging engine (to ‘hugo’, a lightning fast static site generator that is very easy to install), but also to strip the blog of anything and everything that did not matter without impacting functionality. The blog had to look exactly like it did before, work exactly like it worked before and it had to work on both regular browsers and mobile platforms.

Mobile matters a lot these days and I think that when a large chunk of your readers sits on metered bandwidth you can do them an easy favor by making sure that they don’t download more than they have to, it saves them both money and time.

This took a bit of doing, but I'm pretty happy with the end result: the ratio of data pushed to the user for a single page is 20:1 for old versus new, and the wrapper:content ratio is now 5:1, where before it was a whopping 100:1! This particular article is about 5,000 bytes in its original un-rendered form; the server has transferred about 13,000 bytes to your computer fully ‘wrapped’ in HTML, CSS and so on. That's about 3:1, which isn't all that bad. (You can verify this yourself in Firefox by pressing Shift-Ctrl-Q and then reloading the page; that's a pretty useful tool for determining what gets sent to load a page.)

The steps I took to get rid of the bloat are:

  • inlined the few images that are still left

  • inlined the stylesheet (there is a cache penalty here so you have to trim it down as much as possible but the page starts rendering immediately which is a huge gain at the cost of a little bit of extra data transferred, all the measurement tools I’ve used seem to agree on this)

  • got rid of most of the CSS rules that weren’t used

  • got rid of almost all javascript (jquery, various plug-ins, analytics)

  • got rid of external fonts (the slightly nicer look is not worth the extra download and delay)

  • replaced the Twitter plug-ins for ‘latest tweets’ and ‘twitter button’ with static content

  • reduced the number of resources loaded from the server to render the page to 1 (the page itself)

The end result is pretty lean-and-mean. And all of that change barely affected the look or functionality of the site for a user, the difference is really minimal. So on all pages that do not contain images (and that's most of them) the page is one single request. That's it. No css, no javascripts, no fonts, no images loaded from the server. The pages load < 20 kilobytes from the server on average (compressed), they load in under 150 milliseconds from start to finish and they render in less than 200 milliseconds. Clicking around on the pages in this blog should be instantaneous and should never result in having to wait for the next page to load, it should look to your eye as if you already had the page in your local cache.

The embedding of the stylesheet in particular was a good move, it dramatically reduced the time required to render the page because it doesn't require the loading of an extra resource before the rendering engine can fire up. The overhead from sending the CSS data multiple times when multiple pages are loaded is definitely not fantastic but by pruning down the CSS that overhead by itself was reduced by a factor of 4 or so.
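The inlining steps can be sketched in a few lines of Python. This is not the actual build code of this blog (hugo handles that); it is a rough illustration, and the `inline_assets` function, the file names and the PNG/CSS-only assumptions are all mine:

```python
import base64
import re
from pathlib import Path

def inline_assets(html, root):
    """Fold local images and stylesheets into the HTML so that a page
    becomes a single request: <img> sources become data: URIs and
    stylesheet <link>s become inline <style> blocks.
    A sketch: assumes simple src/href attributes and PNG/CSS files."""
    root = Path(root)

    def img_to_data_uri(match):
        # Base64-encode the image file and embed it directly in the tag.
        data = base64.b64encode((root / match.group(1)).read_bytes()).decode("ascii")
        return f'<img src="data:image/png;base64,{data}"'

    def css_to_style(match):
        # Replace the external stylesheet reference with its contents.
        css = (root / match.group(1)).read_text()
        return f"<style>{css}</style>"

    html = re.sub(r'<img src="([^"]+\.png)"', img_to_data_uri, html)
    html = re.sub(r'<link rel="stylesheet" href="([^"]+\.css)">', css_to_style, html)
    return html
```

The trade-off is the one described above: inlined assets can't be cached across pages, so this only pays off once the CSS and images are trimmed down to a few kilobytes.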

I'm sure I can do better still; for instance, the CSS block is still quite large (too many rules, not minified yet), but it can be quite hard to figure out which rules can be dropped and which are essential (nice idea for a browser plug-in: take the CSS loaded by the page and remove all unused rules). Even so, the difference between what was there before (600+K, 18 requests) and what is there now (< 20K, 1 request) is so large that any further improvements are unlikely to move the needle. Optimizing a thing like this is likely a bad investment in time, but it is hard to stop doing a thing like this if you're enjoying it, and I really liked the feeling of seeing the numbers improve and the wait time go down. This is a nice example of ‘premature optimization’, but I do hope that the users of the blog like the end result.

If you know of a blog or have one that loads faster than this one or uses tricks I’m not aware of I’d like to hear about it!