twisterghost

Last Login: June 24, 2017
Rank: Moderator



Hits: 161,008
Joined August 06, 2005
Games (19)

Elicti (Old version) - February 08, 2006
LOLOMGWTFBBQ - June 26, 2006
64Buddy - April 17, 2006
ludaBAD - April 20, 2006
NoobRageX - April 25, 2006
HummingBird - April 26, 2006
Elicti-Version 1.1.3 - June 15, 2006
LOLOMGWTFBBTWO - July 13, 2006
SWP 1.5 - July 31, 2006
Total Pwnage - August 21, 2006
Elicti: Patch 1.1.8 - September 17, 2006
LOLOMGWTFBBQ3 - August 25, 2006
AzNventure! - September 01, 2006
AzNventure2! - November 02, 2006
Spinning Squares - December 02, 2006
Elicti: 2.0.1 - December 10, 2006
LOLOMGWTFBBQ4 - December 25, 2006
King Cuko's Lost Gold - February 07, 2007
Kevin's Quest - January 01, 2007
Rethinking "fast"
Posted on June 15, 2017 at 09:14

A huge focus of mine at work over the past few months has been performance: making sure our product runs not just well enough, but truly well.

Unfortunately, that isn't where our product is, but we can get there.

Performance basically has to be a given these days, especially on the web. Most people will navigate away from a website if it doesn't produce a meaningful paint within 3 seconds. Given a halfway decent internet connection, that should not be a hard target to hit.

Unfortunately, as projects grow, deadlines loom and corners get cut, performance and forward thinking fall by the wayside in favor of shipping quickly. That, or misconceptions about the "right way" to do things take hold.

We've seen this firsthand with our product, which is a huge React application. Now, React is fast - but what isn't fast is having thousands of components on your page, each doing heavy lifting every time anything updates. We adopted React very early on, before there was much written about the right way to do things. Because of that, our application does heavy lifting all the way down, so if you need to render something with 10 layers of children, suddenly it chuuuugs.
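The standard remedy for this kind of problem is to skip render work when a component's inputs haven't changed - in React terms, what a `shouldComponentUpdate` check buys you. A minimal sketch of the idea in plain JavaScript (the names `shallowEqual` and `memoizeRender` are mine for illustration, not from our codebase):

```javascript
// Return true when two props objects have the same keys and values.
function shallowEqual(a, b) {
  const keysA = Object.keys(a);
  const keysB = Object.keys(b);
  if (keysA.length !== keysB.length) return false;
  return keysA.every((k) => a[k] === b[k]);
}

// Wrap a render function so it only re-runs when props actually change.
function memoizeRender(render) {
  let lastProps = null;
  let lastResult = null;
  return function (props) {
    if (lastProps !== null && shallowEqual(lastProps, props)) {
      return lastResult; // skip the heavy lifting entirely
    }
    lastProps = props;
    lastResult = render(props);
    return lastResult;
  };
}

// A "component" whose render is expensive; count how often it actually runs.
let renderCount = 0;
const heavyRender = (props) => {
  renderCount += 1;
  return `<li>${props.label}</li>`;
};

const memoized = memoizeRender(heavyRender);
memoized({ label: 'a' }); // renders
memoized({ label: 'a' }); // same props: cached, no work
memoized({ label: 'b' }); // props changed: renders again
console.log(renderCount); // 2
```

With thousands of components, pruning entire subtrees this way is the difference between "React is fast" and "suddenly it chuuuugs."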

Anyway, this kind of thinking has driven a lot of my development recently. For one, I found a way of developing and hosting the Evenfall website that keeps it quick, super lightweight, and highly available. Granted, it is a small, static website, but I chose that on purpose to make this possible.

The website is built with Metalsmith, a build tool for static websites that is entirely plugin-based. I have just as many plugins as I need, and nothing more. The site builds in about 1.5 seconds, and most of that time is spent transforming the favicon into all formats. Without the favicon transform, it goes down to under 1 second.
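For reference, a Metalsmith build is just a chain of plugins over a source directory. A minimal sketch of such a script - the specific plugins here (`metalsmith-markdown`, `metalsmith-layouts`) are common choices I'm assuming for illustration, not necessarily what the Evenfall site uses:

```javascript
// Hypothetical minimal Metalsmith build script; plugin choices are assumptions.
const Metalsmith = require('metalsmith');
const markdown = require('metalsmith-markdown');
const layouts = require('metalsmith-layouts');

Metalsmith(__dirname)
  .source('./src')          // raw content files
  .destination('./build')   // static output, ready to upload
  .clean(true)              // wipe ./build before each run
  .use(markdown())          // convert .md files to .html
  .use(layouts())           // wrap pages in templates
  .build((err) => {
    if (err) throw err;
    console.log('Build complete');
  });
```

Because each plugin is just a function over the file tree, dropping one (like a favicon transform) is a one-line change, which is what keeps builds around a second.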

The output is then just served from an S3 bucket on AWS, with a CloudFront CDN distribution in front of it. All of it was easy to set up, and it serves the site over HTTPS, quickly and efficiently, for a few dollars a month at most.
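The deploy step for a setup like that can be as small as two AWS CLI calls. A sketch with placeholder names (the bucket name and distribution ID here are hypothetical):

```shell
# Sync the static build output to the S3 bucket, removing stale files.
aws s3 sync ./build s3://example-bucket --delete

# Invalidate cached copies so CloudFront starts serving the new build.
aws cloudfront create-invalidation \
  --distribution-id EXAMPLEDISTID \
  --paths "/*"
```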

I also apply this mindset when working on the game itself, but I will go into that in another post. Jonathan Blow gave a great talk about speed recently, and in it he used Photoshop as an example: it takes Photoshop several seconds to open the "open file" dialog on his beefy laptop. Now, if computers have gotten so much faster, why does Photoshop still take several seconds to open this dialog, when a machine that was orders of magnitude slower took the same amount of time to open the same dialog years ago?

The answer is bloat, for sure, but it makes you think. Virtually nothing in a standard workflow on a powerful machine these days should take more than a second unless it is doing serious computational work - compiling, for example.

Except, whoops, this talk is also about his programming language, Jai, in which he shows the compiler compiling and linking a 50k+ line program in 0.4 seconds.

Anyway, it's a good talk (audio starts at about 1:15):



~tg


I figure the biggest reason speed has been sacrificed over time is that there's a large opportunity cost to making a program more streamlined and efficient. Further, customers demand features over speed. This creates a feedback loop: customers slowly become used to "sluggish" speeds (speed is something you mostly notice at first, then quickly get used to and accept as your new norm), which means a faster product matters less to them than a more feature-rich one.

Frankly, it wasn't until Chrome became unbearably slow and bloated (taking several seconds to start up) that I finally swapped to Opera, and I haven't looked back. But I probably spent years with a product that, today, I would find unacceptably slow compared to what I'm using now. The interesting bit is that I was a day-one Chrome adopter, because I was sick and tired of how slow browsers were at the time. But it slowly became more and more bloated, its speed imperceptibly degrading until it was unusably slow.

I figure software in general is the same way. There just isn't overwhelming demand for efficient, fast programs, even though we all love them, because for the most part we've accepted that programs *should* take a certain amount of time to load or to perform certain actions. We want more features, not the same thing but faster, and so the market accommodates those demands.
Posted by Cesar on June 19, 2017 at 17:36
