Tuesday, August 19, 2014

Oxygen concentrations


This is something I have been meaning to mention for a while but don't think I have done... been sitting on it as part of the idea for an SF storyline, but fuck it - I have more of those than I could use in a lifetime of writing, ideas are not my problem.

Something we take for granted is combustion - simply apply heat to a wide variety of things like wood or coal or oil and they'll happily burn. But do you realise that if the concentration of oxygen in the atmosphere were only slightly lower this would not be true? It's far more critical than you might think: a drop of only a few percent is enough to stop combustion... But life would still be perfectly possible - humans can survive perfectly happily at oxygen concentrations far below those necessary to support combustion.

So what? I hear you think... well, the thing is that the oxygen concentration in our atmosphere has varied quite widely and could very easily have been lower than it is for the whole of human existence - now, if that had been the case then fire would not work... without fire we could not have developed any form of technology... no extracting metals from ores, no alchemy leading to chemistry, no glass, no ceramics, no steam power...

What a terrible trap lurks for intelligent beings who happen to evolve on a planet with such an atmosphere.


Friday, June 20, 2014

BASIC


[amusement] It occurred to me during a conversation that BASIC was a language designed to introduce beginners to programming and used to be regarded as trivial to use; but it's now regarded by modern programmers as far too complicated for them to do anything with...

One wonders where this process will end. When the final "programmer" becomes unable to operate the single switch required to turn the computer on, I suspect.

Wednesday, May 21, 2014

Software blues.


Just done a little light reading for relaxation - a guide to optimising x86 assembler code... it is both amazing and appalling how much hardware there is inside these poor processors to try to deal with the awful code they're given to run by compilers.

It's insane. Rather than have programmers write competent code - which, I keep having to remind myself, is actually quite easy - we've ended up in a world where nearly all programmers write complete garbage that includes masses of other complete garbage that other equally-incompetent garbage-mongers have gathered together, and all this offensively incompetent shite gets compiled to a vast swamp of instructions that are then examined by the hardware, broken up into smaller micro-instructions and re-ordered into some kind of sense before actually getting executed... but no matter how impressive that hardware is - this happens tens of billions of times in each and every computer, every second - it doesn't make the whole ridiculous process any less inefficient... and all this hardware and furious activity burns electricity and generates waste heat.

Feel your computer - is it warm? hot? Unless you're doing something that actually needs lots of computation (3D games count) that heat is almost entirely generated by programmers being stupid.

Why is this all so inefficient? Mainly because the languages and tools programmers prefer to use are chosen *not* by any logical, rational process but by the whims of fucking fashion - fashion set by programmers with no idea of what is actually going on under the hood and no interest in finding out.

And they'll tell you this doesn't matter - hell! they tell *me* it doesn't matter and I know exactly how full of shit they are - but it matters to the extent that at the present moment the IT industry accounts for about 2% of all the energy used in the world. That's about the same as the entire aviation industry.

Think about that for a second... it should be such a small fraction that it can't be bloody measured easily. Instead it's a major source of wasted energy and pollution... and this energy isn't used because it needs to be used; not at all. This energy is almost entirely wasted, and it's wasted because programmers are so fucking useless. And they're not getting more efficient either, quite the reverse. All the time they find new ways to add layers of inefficiency to everything.

Programming constantly moves away from energy efficiency towards less efficient languages and tools. Programmers actively choose not to use efficient languages and expect everyone to buy faster and faster computers to keep up with the growth of their inefficiency. But this is hardly new, it was old news thirty years ago... for most of my life I've watched them at it with a kind of shocked disbelief... knowing how easy it is to write efficient software leaves you wondering what the hell it is about getting things wrong that other programmers find so appealing. It's harder to do it badly.
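A toy illustration of what I mean - my own contrived example in C, not taken from any real codebase. Both functions do the same job, counting the letter 'a' in a string, but the first one re-scans the entire string on every pass for no reason at all:

#include <stdio.h>
#include <string.h>

/* the fashionable way: call strlen() on every trip round the loop,
   quietly turning a linear job into a quadratic one */
size_t count_a_sloppy(const char *s)
{
    size_t n = 0;
    for (size_t i = 0; i < strlen(s); i++)
        if (s[i] == 'a')
            n++;
    return n;
}

/* the competent way: one pass over the data, no redundant work */
size_t count_a_sane(const char *s)
{
    size_t n = 0;
    for (; *s; s++)
        if (*s == 'a')
            n++;
    return n;
}

int main(void)
{
    const char *text = "an example string with a few a characters";
    printf("sloppy: %zu  sane: %zu\n", count_a_sloppy(text), count_a_sane(text));
    return 0;
}

Trivial, yes - but multiply that habit by a few million lines and a few billion executions a day and there's your warm computer.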

Reading this status has probably used about a million times as much energy as it should have done, and would have done if all the programmers involved in writing the code were skilled... the sad fact is most programmers aren't just bad but completely fucking clueless. So monumentally incompetent that there is no way anyone outside the IT industry will ever understand how fucking thick and clueless they are - most of you just don't have any experience of anything being done as badly as programmers do things. They make cowboy builders look like paragons of efficient competence. Saying they build structures like Heath-Robinson contraptions is ridiculously understating the absurdity of their habits.

Ah, fuck it. This is pointless... It's our energy and money they're wasting, and it's our planet they're raping, but nothing I say is going to stop that or make even a single one of the lazy bastards change their ways or admit to their crimes. And make no mistake - they are crimes. I just hope that one day they're recognised as such and the software industry becomes accountable for the energy it wastes.

Now, programmers will be reading this and bursting with asinine remarks in defence of bloat and crud - understand clearly that I don't give a shit what miserable justifications you present; I've heard them all, and shown them all as the tripe they are countless times... what I'm left with is the sense of utter shame that people outside the IT industry confuse what I do with what you do.

Wednesday, April 30, 2014

Mice...


Given that mice are affected by the scent of male researchers, I wonder if there's a market for things that clip onto their cages in labs and release that male scent over a long period to acclimatise the little buggers.

I'd look into patenting the idea and making these if I had time... someone get on with it, I want 5%...

Friday, April 11, 2014

Help getting it up...


Dear computing agony aunt... I have an embarrassing problem... does size matter? I know everyone says it doesn't, but are they just being kind?

It's... it's hard to talk about this... but I feel embarrassed whenever programmers start talking about the size of their sources. I just can't compete. I can't show anyone my printouts, they're just so weedy. I have to hide them inside the covers of other people's sources... when it comes to lines of code, I'm absolutely convinced modern programmers know how to write far more of them than I do. And not just twice as many, or ten times, but thousands or even millions of times as many... I'm not imagining this! I'm not! I'm not! I need help bloating!


I mean... I've just added up all the lines of code in my wireless sensor network, a big project with operating systems, networks, applications, servers, all sorts of horrible things, dozens of different processors, and I had to add in the sources for the tools I used to build it all to even scrape into the ten thousand line bracket... I wrote half of it in assembler, you can't say I'm not trying! What more is a man to do? My bugs got tired of complaining that they had nowhere to hide and left me.


I try, I really do, but they're always talking about simple things that they've needed millions of lines of code to do, and I always run out of project before I reach even ten thousand lines... I add things and put bells and whistles and all sorts of unnecessary options in, and they only add a few hundred lines at best. Maybe a thousand if I'm really verbose and hide a game or completely different application in there somewhere... It's a worry. What am I doing wrong? Please help... I feel so lonely and out of touch.


(minimalist from Chester, 52)


PS. I was told to try C++ or Java, but... I still can't seem to manage it.


It's not fair... I'll never be able to survive by charging by the line. [sniffle]

Thursday, April 10, 2014

Modern programming...


What I would like - hell! What I would absolutely fucking LOVE is for software products to have to include an ingredients list... if it uses open source garbage it should say so on the packaging in a great big warning label. The proportion of the source-code that the "author" hasn't even fecking well looked at, let alone understood, should be stated so that people have an idea how little the vendor knows about what they're selling.
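The arithmetic is hardly difficult, either. A rough sketch in C of the sort of label I mean - the assumption that imported code lives under paths containing "vendor" or "third_party" is mine, purely for illustration:

#include <stdio.h>
#include <string.h>

/* count the newlines in one source file */
static long count_lines(const char *path)
{
    FILE *f = fopen(path, "r");
    if (!f)
        return 0;
    long lines = 0;
    int c;
    while ((c = fgetc(f)) != EOF)
        if (c == '\n')
            lines++;
    fclose(f);
    return lines;
}

int main(int argc, char **argv)
{
    long own = 0, imported = 0;
    for (int i = 1; i < argc; i++) {
        long n = count_lines(argv[i]);
        /* crude classification: vendored paths count as imported code */
        if (strstr(argv[i], "vendor") || strstr(argv[i], "third_party"))
            imported += n;
        else
            own += n;
    }
    long total = own + imported;
    if (total > 0)
        printf("ingredients: %.1f%% imported (%ld of %ld lines)\n",
               100.0 * imported / total, imported, total);
    return 0;
}

Point it at a source tree (find . -name '*.c' | xargs ./ingredients, or whatever suits) and print the resulting percentage on the box. That's the whole warning label.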

It really REALLY pisses me off that there's no public distinction between the software written by people who have a clue what they're doing and the integrity not to use random garbage and the utter tripe that's thrown together by mindless, clueless garbage collectors, who go bin-diving in the open-source sewer and pile up sludge until they have enough to call it a product... except that image is far too kind for what clueless modern programmers actually do; there are really no ways to describe or conceptualise the extent of their miserable incompetence.

It sickens me to be thought of as belonging to the same profession as modern programmers. It utterly, utterly sickens me.

[later, after watching more news about the latest exploit found in open-sewer security software]

Whenever I see some turd of a programmer extolling the "virtues" of open-source software, using the lie that "anyone can see the source so it gets reviewed" - as if many of the modern programming crowd have the competence to judge anything, or as if anyone has the time to wade through the hundreds of millions of lines of code that get dragged into every project these days - I want to kick the lying shit out of them.

[muses] It would be unfair (and inaccurate) not to acknowledge that there are some very skilled programmers involved with writing open-source code, and most of the people who do it have good intentions. But the problem is that, as with a chain, it's the weakest link that determines the overall quality of any project, not the strongest or even the average.

Writing reliable code involves far more than enthusiasm and the willingness to become involved; it's a real skill that takes time to acquire, and not many people have the right mindset, patience and consistency to do it even if they're willing to try - and few are: there's far less glamour in carefully writing a solid, reliable application than there is in dashing off something QAD (quick and dirty) and then scampering on to something new and exciting, leaving behind a trail of half-finished projects for others to clean up.

But never mind... I have discussed this subject many times over the years and to do it justice requires more time and effort than I'm prepared to spend on it on facebook; I could and probably should write books on the subject. It's hard enough to get people to understand how to write code that can be tested, let alone expect them to try to test code that wasn't written with testing in mind by someone else who doesn't understand the process, who has a different coding style and skill level and like as not was introducing new problems while trying to deal with others and working on a completely different hardware platform... so the oft-mentioned concept of open-source generating reliability by peer-review and the process of bug detection/removal is childishly, painfully naive. 
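To give some idea of what "written with testing in mind" actually looks like - a minimal sketch, an invented example of mine rather than anything from a real project: keep the logic in a pure function with no hidden state and no I/O, so a test can feed it known data and check the answer directly:

#include <assert.h>
#include <stdint.h>
#include <stdio.h>

/* pure function: the result depends only on the arguments,
   so it can be exercised in isolation */
uint8_t checksum8(const uint8_t *data, size_t len)
{
    uint8_t sum = 0;
    for (size_t i = 0; i < len; i++)
        sum += data[i];
    return (uint8_t)(~sum + 1);    /* two's-complement checksum */
}

int main(void)
{
    const uint8_t msg[] = { 0x01, 0x02, 0x03 };
    uint8_t c = checksum8(msg, sizeof msg);
    /* the defining property: message plus checksum sums to zero */
    assert((uint8_t)(0x01 + 0x02 + 0x03 + c) == 0);
    printf("checksum 0x%02X ok\n", c);
    return 0;
}

Nothing clever - but try doing even that much to a function welded to a database, a GUI and three frameworks somebody else chose.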

I predicted over twenty years ago that we should expect software to stop behaving like an all or nothing digital system that either works or doesn't, but instead to experience a combinatorial explosion of unreliability until computing environments behave in an analog fashion with multiple degrees of failure of varying severity following an exponential decay curve slowly approaching - but never reaching - a stable state. I pretty much nailed that one, unfortunately.
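In symbols, just to pin down the shape of the curve I meant (my notation, nothing more): with $U(t)$ the overall degree of failure of a computing environment at time $t$, settling towards a stable level $U_s$ that it never quite reaches,

$$U(t) = U_s + (U_0 - U_s)\,e^{-\lambda t}, \qquad \lambda > 0,\ U_0 \neq U_s.$$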

Monday, February 03, 2014

From an old grumble of mine. I doubt I'd change a single word:


The complexity of any given software project may informally be judged as proportional to the number of requirements the design must satisfy raised to the power of the number of programmers involved, and the likely reliability as inversely proportional to that complexity raised to some power greater than unity.

(It should be understood that the number of programmers used here is not limited to those actively engaged in directly writing code for the project, but must also include, in some suitably scaled fashion, those who "contribute" to any libraries, objects, tools or operating systems involved. I usually describe this process as "The more, the messier".)
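In symbols, for anyone who prefers them (my notation: $R$ the number of requirements, $P$ the suitably scaled programmer count, $C$ the complexity, $k$ some exponent greater than unity):

$$C \propto R^{P}, \qquad \text{Reliability} \propto \frac{1}{C^{k}}, \quad k > 1.$$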

In practical terms the consequence of this is that the only chance there ever is of producing reliable code is to reduce the number of requirements for each project and use small teams or competent individuals who are responsible for designing the entire system. The history of computing and indeed engineering in general is full of examples of successful designs using this strategy, and also of alternative large-scale design processes that consistently fail, overrun deadlines and eventually produce bloated unreliable garbage when they produce anything at all.

I have been asked "if that is really the case, what is the solution?" - a question I regard as fundamentally flawed. There is no short-term solvable problem here; it's a limitation of human intelligence and the overheads of communication. They're hard limits, and any software project that cannot be divided into fractions which are within the capability of a single competent individual should be expected to be (a) unreliable and (b) delivered late if delivered at all...

In the long run the problem may be solved by the development of AI and the arrival of more competent individuals/reliable communication; given the intractable nature of intelligence and the inherent inefficiency of the software development required to produce them I would not expect this development to occur for several decades.