Let me start by saying that the math in most programming languages is WRONG. In most cases, there is little to be done about this, except perhaps to tack on some new classes which allow new code to be written correctly.
However, nascent languages, like Rust, which I’ll mainly address here, have an opportunity to do better. I believe they SHOULD do better, and so I’m taking a little time, into the small hours, to attempt to make the case. Let’s get started then, bearing in mind that, in the small hours, this won’t be as well-edited, or even well-reasoned, as one might hope ;)
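To make the kind of wrongness I mean concrete before diving in, here's a quick illustration in Python (for brevity; the same applies in C, C++, Java, and most others). Binary floats silently violate basic decimal arithmetic, while an exact type like `fractions.Fraction` gets it right, but is bolted on rather than being the default:

```python
from fractions import Fraction

# Basic decimal arithmetic is silently wrong with binary floats:
print(0.1 + 0.2 == 0.3)   # False
print(0.1 + 0.2)          # 0.30000000000000004

# An exact rational type gets it right, but it's opt-in, not the default:
print(Fraction(1, 10) + Fraction(2, 10) == Fraction(3, 10))  # True
```

That `False` is exactly the sort of thing a new language gets one chance to fix by choosing better defaults.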
I just saw yet another interesting blog post that I wanted to contribute a comment to, but bypassed it, because it uses the horrible Disqus comment system. So, I want to take a moment to rant (or, more helpfully, point out) that using Disqus or third-party comment systems like it will probably do your site (and the internet as a whole) more harm than good.
I’ve been studying a lot of languages lately, looking for a modern, parallel language which is a deserving successor to C/C++/Python as my main go-to language for Getting Things Done. Someone on Reddit was specifically asking why C++ isn’t ideal any more, and I wrote up this response based on my research. So, I thought I’d repost it here, too.
Let’s get right into what’s wrong with C++, then, and why I’d sooner replace it.
I’ve recently taken a few projects out of storage (literally: the mountpoint is called “storage” :), and posted them on GitHub. So, I thought I’d take a moment to point out each one. I’m also going to get into some current projects, which will be appearing on GitHub one of these days.
Someone asked on Reddit:
“How lines of text (coding) can turn into beautiful, complex environments in games?”
I wrote up a quick reply, which I liked enough to post here instead. I should really take the time to write it more carefully, building the explanations, evidence, etc. That could be a powerful read, or maybe even a programmer’s manifesto of sorts ;) But I’m lazy, and I sometimes enjoy the quick flow of prose as much as the precision of slowly, carefully spelling out details. So, I’m going to leave it like this, for now :)
There are hand-drawn graphics etc., but I don’t think that’s what you’re getting at.
The kind of beautiful complexity you’re talking about is exactly what the beauty and joy of programming is about: creating beautiful, complete worlds, using the simplest, most elegant models you can think of. It’s really a lot like how the real world works: animals, plants, etc., all come from very simple DNA codes. Proteins etc. are built from those simple codes, being processed according to simple rules. This happens billions of times. Those created proteins etc. form cells in accordance with other simple rules, and those cells react to light, gravity, water, etc. according to other rules. Organs interact according to their rules, people react according to their rules and their own unique attributes (just like earlier, proteins were built according to the unique attributes of DNA)… So, on and on, we have this ever-increasing complexity, built from very simple principles acting on other simple principles and individual variables.
Eventually, you reach a point where complex organisms feel the sun on their skin, smile, grab a girl’s hand, and go running through a field, feeling the wind on their face, turning to see the stars in another’s eyes, etc.
Beautiful complexity, from simple elegance. It’s art. It’s poetry. It’s engineering. It’s a new form of art, and yet it’s the same art humans have made, following the same rules except with different variables, since they first scratched on the wall of a cave.
You might want to read up on complex systems / complexity theory / chaos theory, genetic algorithms, Conway’s Game of Life, and similar topics for more on this.
Or just go look at a bridge ;)
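If you want to see that "complexity from simple rules" idea in miniature, Conway's Game of Life is the classic demonstration: two rules, and gliders, oscillators, and whole little ecosystems emerge. Here's a throwaway sketch of my own (not from any library):

```python
from collections import Counter

def step(live):
    """One Game of Life generation; `live` is a set of (x, y) cells."""
    # Count live neighbours for every cell adjacent to a live one.
    counts = Counter((x + dx, y + dy)
                     for x, y in live
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # Survive with 2-3 neighbours, be born with exactly 3 -- that's the whole rule set.
    return {c for c, n in counts.items() if n == 3 or (n == 2 and c in live)}

# A "blinker": three cells in a row oscillate forever between | and -.
blinker = {(0, 1), (1, 1), (2, 1)}
print(sorted(step(blinker)))        # [(1, 0), (1, 1), (1, 2)]
print(step(step(blinker)) == blinker)  # True
```

A dozen lines of rules; run it on a big enough grid for long enough and you get structures nobody designed.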
A lot of people run into a problem when they first set up a webserver — perhaps in a VPS or other small system — using default settings: the system will run fine at first, but when under load, probably when the owner isn’t around, it will grind to a halt. The owner comes back, tries to use the site, and it’s unbearably slow. Even logging into the server to see what’s wrong can be unbearably slow. The worst, but all too common, response is to eventually log in, reboot the server, and carry on as before.
The usual cause of this is not realising that web / database servers need to be configured carefully for the memory use per connection.
For example, if you’re just running Apache with mod_php, and PHP is configured for 128MB per connection, Apache might default to 32 worker processes or more (it might even be 128 by default), which is a LOT of memory for a small VPS. Then there is your database on top, and all of its connections, caches, etc.
For a webserver, what you need to aim for is for the maximum number of processes, under full load and full memory usage, plus any extra software like firewalls and cron jobs, to never exceed physical memory. If you do need to run heavy cron jobs etc., then enable virtual memory, but only if you’re sure there are quiet times for your server, like 3am, when it can afford to crawl. Otherwise, you need to take the hit and reduce the number of processes and the maximum memory per process, or increase the server memory, to cope without swapping.
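As a back-of-the-envelope check, the arithmetic looks like this (the numbers below are illustrative ones I've picked for a hypothetical 1GB VPS, not anyone's real config):

```python
# Rough memory budget for a small VPS -- all numbers are illustrative.
ram_mb           = 1024  # physical RAM
php_limit_mb     = 128   # PHP memory limit per worker
apache_workers   = 32    # maximum Apache worker processes
db_and_system_mb = 300   # database, caches, firewall, cron jobs, etc.

# Worst case: every worker hits its memory limit at the same time.
worst_case_mb = apache_workers * php_limit_mb + db_and_system_mb
print(worst_case_mb)  # 4396 -- more than 4x the RAM, so the box will thrash

# Working backwards: how many workers can this box actually afford?
safe_workers = (ram_mb - db_and_system_mb) // php_limit_mb
print(safe_workers)  # 5 -- a long way down from the default
```

The point isn't the exact numbers; it's that the defaults assume a machine several times bigger than a typical small VPS, and you have to do this sum yourself.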
Officially, the UK inflation rate is 2.4%. Except that inflation is supposed to be “a measure of the rise in cost of goods and services”.
Based on actual figures from the Office of National Statistics, the same organisation that reports the inflation rate, things are very different:
- Gas and electricity: +142%
- Car tax & insurance: +108%
- Council tax and rates: +49%
- (and so on)
The two are highly out of sync. On the one hand, you have huge actual rises in the cost of living. On the other hand, you have methodologies that make everything look right as rain.
When inflation gets really bad (when hyperinflation occurs), the end result is that the economy collapses, or has to take extreme measures to prevent collapse. We’re not there yet, but this should be starting to sound familiar.
And yet, they have the audacity to claim that inflation is falling, to 2.4%.
This is how you know your country is slipping… not into recession, but into global irrelevance.
As Her Majesty might say, one would do well to learn Chinese soon.
Following on from part 1 of this article, I’d like to take you through some PyPy vs. CPython benchmarks.
Note: in all benchmarks, I’m measuring total memory use for the entire interpreter run, but only measuring time taken across the code I’m actually interested in testing. There’s a subtle (depending on your experience) difference, because the interpreters take time to start up and shut down around the code actually being tested. From my perspective, it seems wiser to ignore this startup time, since most applications run for a long time, and users are interested in performance once running, rather than during startup/shutdown. Either way, the startup/shutdown time is negligible for both interpreters in this case.
Let’s take a quick benchmark of CPython vs. PyPy. This simple script will take a number on the command line, and generate that number of objects, each containing a dictionary of 30 sub-objects, which in turn contain simple string values as fields.
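Something along these lines captures the shape of it (this is a sketch matching the description above; the class names, field names, and string values are placeholders of mine):

```python
import sys
import time

class Sub(object):
    """A leaf object holding a simple string value as its only field."""
    def __init__(self, value):
        self.value = value

class Obj(object):
    """An object whose dictionary holds 30 sub-objects."""
    def __init__(self, i):
        self.fields = {"field%d" % j: Sub("value-%d-%d" % (i, j))
                       for j in range(30)}

def run(count):
    # Only this allocation loop is timed; interpreter startup is excluded.
    start = time.time()
    objects = [Obj(i) for i in range(count)]
    print("created %d objects in %.3fs" % (len(objects), time.time() - start))

if __name__ == "__main__":
    # Object count from the command line, with a small default for convenience.
    run(int(sys.argv[1]) if len(sys.argv) > 1 else 1000)
```

Run it as, say, `python bench.py 1000000` under each interpreter, and measure peak memory for the whole process alongside the printed time.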
I get a lot of use out of PyPy. In fact, it’s become my default python interpreter, replacing CPython, at least for Python 2.x code. Python 3.x support in PyPy is coming real soon now; most of the tests are passing, so the next release will probably make it happen.
So, I wanted to write a little about the virtues of PyPy, and its potential to be your default Python interpreter, too. I also want to talk about the main issues that might present roadblocks at the moment, and how you might work around them. Finally, while most benchmarks focus on PyPy’s speed, I’m also going to examine its memory usage.