Reaching the limits

It is very often at its limits that a system breaks. This can happen unexpectedly, leaving us not knowing why something happened the way it did. We are not always in control of these limits, yet it is precisely these limits that make developing websites, programs and software so interesting, even fascinating.

Recently, I performed a routine backup, moving only the most recent batch of new files to my external hard drive. What I didn’t realize was that this time I would get an error message that would make me rethink how I handle this. I tried a couple of times and got two different error messages: one said that I didn’t have the necessary permissions to copy the file, the other simply that the file or directory could not be created. I was baffled; just reading these messages, I couldn’t tell what was wrong. Something with the hard disk, maybe? Fortunately, both gave me the same error code, 0x80070052, whatever that meant. It was this non-descriptive code that brought me to online forums and made me realize what the reason might be.

When I started using the drive, I was fully aware of how it was formatted (FAT32, like most external drives) and of its publicly stated limits: you can’t copy files larger than 4GB and you can’t have more than roughly 65,000 files per directory. These limits were fine for me, since I don’t store heavy media and I wasn’t expecting to reach 65,000 files in a single directory. However, I do like to keep many files in a single directory, and I also like labeling them with plenty of keywords so I can find them more easily later. It turns out that the FAT32 limit is really a limit on directory entries, and a file with a long name consumes several of them, so with my long file names the error appeared at just over 10,000 files. To verify that the problem wasn’t the disk but its file system, I tried moving a single test file around. This worked fine as long as I wasn’t copying it into the largest directory. Bingo. I had never reached the limit of a file system before, so this behavior was strange and new to me. Sharing it here might help other people reconsider whether they should use FAT32 if they plan to label large collections of files. As soon as I reformatted the drive to NTFS, all the files were copied.
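To put rough numbers on this: each file in a FAT32 directory takes one short 8.3 entry plus one long-file-name entry per 13 characters of its name, and a directory can hold at most 65,536 entries. The sketch below (in TypeScript, with made-up helper names and sample file names) just does that arithmetic; it is a simplified model, not an exact reproduction of my directory.

```typescript
// Back-of-the-envelope arithmetic for FAT32 directory limits.
// Simplified model: each file needs one short 8.3 directory entry plus one
// long-file-name (LFN) entry per 13 UTF-16 characters of its name, and a
// directory holds at most 65,536 32-byte entries. Helper names are hypothetical.

const ENTRIES_PER_DIRECTORY = 65536;
const CHARS_PER_LFN_ENTRY = 13;

// How many directory entries a single file with this name consumes.
function entriesForName(name: string): number {
  return 1 + Math.ceil(name.length / CHARS_PER_LFN_ENTRY);
}

// Rough ceiling on files per directory if every name looks like this one.
function maxFilesForName(name: string): number {
  return Math.floor(ENTRIES_PER_DIRECTORY / entriesForName(name));
}

console.log(maxFilesForName("meeting-notes.txt"));
// -> 21845 (3 entries per file)

console.log(maxFilesForName("2014-02-15 backup, invoices, scanned, bank, tax, receipts.pdf"));
// -> 10922 (6 entries per file, close to where my backup actually failed)
```

With heavily labeled names, the effective ceiling sits near 10,000 files per directory rather than 65,000.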

Another limit I reached: "552 Disk full - please upload later, Error: Critical file transfer error". This happened when I tried to upload a file to this website. As a result, the file was only partially uploaded, which, to my surprise, corrupted it to a size of 0 bytes instead of reverting the upload. Since it was Saturday, support wasn’t available to fix this. I had to test with two different FTP clients to understand what had happened. Total Commander gave me the message "Transfer failed, continue operation?", which in this case was not particularly descriptive. FileZilla produced the message above, which at least gave me a clue. Such messages may be relatively rare, but no one is immune to them. In a case like this, all we can do is remain calm and rational.

Using too many DOM nodes quickly pushes a browser to its limits and causes it to crash. But programmers don’t get a message explaining why this happens, so they might think that their loop has some extreme asymptotic complexity. Many people would say that if something isn’t technologically possible, then doing it is probably wrong. This is right insofar as we should generally strive to use as few resources as possible; it is also wrong, because there are useful exceptions to every rule. By limiting the DOM, we have practically limited what is possible on the web. Even when we can have many elements, we should still examine how fast we can do something useful with them. The idea of inserting DOM elements asynchronously is an interesting one, but in the demos I have seen, the performance gains vary between browsers. On my machine, for instance, Chrome improved a lot, Firefox only slightly, and IE improved in a way that was still far from usable.
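For what it’s worth, one common way to insert many nodes without freezing the page is to append them in small batches and yield back to the browser between batches. The sketch below is a generic TypeScript/DOM example of that idea, not the specific demos I tried; appendInChunks and the chunk size of 500 are arbitrary choices.

```typescript
// Insert many DOM nodes without blocking the page: append them in small
// batches, yielding to the browser between batches via requestAnimationFrame.

function appendInChunks(
  container: HTMLElement,
  total: number,
  chunkSize = 500
): void {
  let created = 0;

  function appendChunk(): void {
    const fragment = document.createDocumentFragment();
    const end = Math.min(created + chunkSize, total);

    for (; created < end; created++) {
      const item = document.createElement("div");
      item.textContent = `Item ${created}`;
      fragment.appendChild(item);
    }

    container.appendChild(fragment); // one insertion per chunk instead of per node

    if (created < total) {
      requestAnimationFrame(appendChunk); // yield so the page stays responsive
    }
  }

  requestAnimationFrame(appendChunk);
}

// Usage: appendInChunks(document.body, 100_000);
```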

And if a limit isn’t browser-related, it can be hardware-related, network-related or something else. The problem we are trying to solve could be intractable or NP-complete; in fact, the majority of them probably are. In a sense, creating something then requires us to navigate around these limits. It’s often much easier not to try to lift them until they become both too limiting and too personal.

Want to use recursion? "Too much recursion" or "call stack limit reached". Animation? 10fps. CSS gradients? Prefixes, please. Browser history? Try, if you have the nerves not to revert to adding hashes to the URL. JavaScript? "You use jQuery instead". Mousemove/scroll events? "Ok, but eliminate everything inside them". PHP template engines? Only if "{$variable}" couldn’t act as a placeholder in native PHP. Ajax with Python, or something like page redirects? "You use Django or Flask for that." NumPy/SciPy? Nice, but N/A on your hosting. Haskell? "We don’t have for loops, but we can offer you folding if you ensure that you can use it within the limits of your RAM". Plotting in R? "Prefer ggplot2". More images and video? If the CPU/GPU can handle them. More HTTP requests? Each round trip costs you 300-500ms; choose how many you like. Base64 encoding for images/fonts? If you don’t read your code that often. Make the font bigger? Only if there’s enough screen space for the other elements. Can you show us the statistics? "With or without the 80% robots?"
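Take the recursion limit from that list as a concrete example: deep enough recursion exhausts the engine’s call stack ("too much recursion" in Firefox, "Maximum call stack size exceeded" in Chrome), while the same computation written as a loop never touches it. A minimal sketch with a made-up sumTo example:

```typescript
// Naive recursion: at sufficient depth the engine refuses to go further.
function sumToRecursive(n: number): number {
  return n === 0 ? 0 : n + sumToRecursive(n - 1);
}

// The same computation as a plain loop never hits the call stack limit.
function sumToIterative(n: number): number {
  let total = 0;
  for (let i = 1; i <= n; i++) {
    total += i;
  }
  return total;
}

// sumToRecursive(1_000_000) -> RangeError in most engines
console.log(sumToIterative(1_000_000)); // 500000500000
```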

As you can see, every technology we have today is limited in some way. Visible symptoms lead to the immediate desire to prescribe pills that only make things worse over time. We have locked ourselves in a prison of keywords, patterns and tactics. These keywords serve more as ego-boosters than anything else. If everyone kept to their own choices, how could we work together in the future? Would we create a bigger community, or thousands of very small, maybe even single-person communities? If we aren’t interested in the limits we create, but only seek to invent the flashiest keyword in a world that has all but run out of them, how could this translate into a future of more opportunities? Do we still need to exaggerate how great things are when they aren’t? Aren’t we the ones who will be most affected by this in the end? What happened to the puzzle?

In a puzzle, the right piece always fits perfectly at the right moment. When we know that we have a single picture and a single possible interface, we can clearly see where a piece can be added. What we have done is create duplicate pieces with the same imagery but slightly different interfaces. They look and feel the same, yet things now work slightly differently, and it becomes much harder to see when a piece fits well. Piece A overlaps the functionality of piece B and does many things better, while being worse at some. Still, A might not fit with C as well as B could. This leads me to my point: we have spread the limits into a chain of excuses. The alternative would have been to work together, see where the limits of each component are, fix them in the right place and work from there. But that requires us to let go of our keywords.

For me, "Don’t reinvent the wheel!" is right because we don’t need more things than are absolutely necessary, and wrong when the wheel is left to spin to no purpose in the prison of limits we’ve created.
