Fixed-size data types

I recently rediscovered it as a quite useful source of ideas. Since I knew that data type sizes aren't guaranteed if you change the compiler or platform, I was unhappy about the fact that I never knew exactly whether variable x would really exceed some value sooner or later. While I was using SDL I always used fixed-size types, but that changed when I lost my interest in continuing video game development. And now I'm sitting here with the same problem in mind, not yet having done any piece of ITK with data types other than void* or size_t, due to a number of reasons. I've got quite a bit more experience since then and think that the only data type sizes that matter in non-platform-specific code are the size of a word or int, and a fixed size you use for especially memory-saving things or definitely portable types.

And geez, I've actually NEVER used anything except words and the hopefully-one-byte-big char type as well as their unsigned counterparts. You know what? ITK will simply never use any flex-size types other than a word (better known as int) and char (for string compatibility reasons). This effectively eliminates all problems with them and I can simplify my API a bit more.

Rejoice! Bool save the bit.


Language of choice, thought to form

As I was watching a wonderful BBC film version of Sherlock Holmes and the Case of the Silk Stocking, I noticed how much more expressive and pleasing the English language is for me compared to German, my native language. I'm sure it has to do with me liking a difference in act and person between myself and others during the time I went to school - the time when my inner nerd tried to get out on every occasion, taking himself as far as he could. Over time my view changed, but I still didn't like to express myself in simple German, or German itself. The more my English improves and the more fluent I become in forming everything in a totally non-native language, the more I notice how actually... different, and new, it feels to form and think in English. It's a question of personal taste and choice as well as availability. Most people I know don't think or talk in English. Even after years of teaching, they tend to leave it as technical and fundamental as they learned it. I, on the other hand, am still writing myself a quite comprehensive collection of inner ramblings and rants using this blog. I can reinvent myself, leaving all annoying German things behind. If I start watching movies in English, reading and writing the stuff I like using it, too... Just think about all the stuff you just don't want to have in your nice, small world. Problems of the world right in front of you: laws, money, stupid people, studies you have to do, people you'd rather not like to know because they rage the crap out of you - all this stuff is gone when I start to think, write and argue in English. I love to hear the sound of something different, something that's not the usual German trash but understandable, too. There are so many things waiting inside of me: dreams, ideas, whole dialogs and whatnot I'd never want to express in German. Especially things related to love and deep emotions are just plainly disgusting to pronounce in German.
Maybe this has to do with the fact that I just don't want to have anything to do with the stuff that's been around me from my birth on. I still live where I was born because I don't have the money to get my own flat, I still have to talk to everyone else in this language made to chop wood and scrap metal, and I'm still stuck with the people who were always present in my whole life till this very moment. I love to completely change everything and begin something new. I often completely reorder my room, delete all savegames of a video game, set up new operating systems and buy some new, colorful bedclothes from time to time. Choosing a different language to express myself better is one step closer to the moment I've been starving for since I first thought of it. This big change is, unfortunately, a very slow process - as planned life changes are in general. The moment I realized that I'm gay may be the first time I started to want a change, a total change turning around everything, how it was and how it would become. Yes, I think it began there. At least something deep inside me clicked. I can only imagine what it could be like to have this change and how it would feel to live it, but I doubt I'll realize that it's there when it actually happens, like always...


BSD is not as Linux as I thought

I always thought of those BSD operating systems as total niche products that run on a few odd machines without any support for modern stuff, let alone desktop PCs with modern graphics cards and so on. But sheesh, I was so damn wrong. I knew nothing - no, even less! Wikipedia has once again disabused me and my uneducated mind. It seems that almost all sources can be ported to BSD-derived operating systems, though a compilation might be necessary to make them available for your system. This makes me once again wonder why the fuck I don't get to know of these things via my university. We only get a few Linux basics, even less for Windows, and the rest is a shitload of bad Java frameworks and some simple algorithmic things. I feel a bit lost now. There are so many MORE interesting systems out there than just the triple of Linux, Windows and Mac OS. And again I feel like I want to write my own operating system because everyone else can do it, too. Of course that's wrong. Of course all this comes from a few almost never-changing things, and of course there are only changes and additions made to Unix, not the other way around. I often forget that much of the stuff running on my PCs is a simple but highly specialized software system made of many, many, many different codes from even more different people. A huge pot of soup bubbling inside-out if you ask me. So what I should know for now is how to use the operating system's functionalities and not more. I still consider myself a programmer wanting to achieve his task very well in a prioritized order of functionalities. Not an operating system writer, administrator or something else deeply connected to the hardware in some way. What I can think of is using hardware directly via a driver-accessing library, or directly via a port and a protocol. So, a strict line between me and anything too system-near. And I'm somehow glad I didn't choose a study path close to hardware or system administration.
I can totally grasp the idea of having a single piece of hardware, a set of driver functions doing stuff with it, or just a normal OS with drivers. That's absolutely graspable and within my perceptual reach. But the more complicated and hardware-connected it becomes... my mind shuts off and I feel lonely, lost and desperate. Maybe that's the real reason why I also can't cope with electronics and anything physics- or chemistry-related. Too far away from logical, step-based and visually connectable reactions. Oh man, when will this ever end. I just hope that I can write my code in a way that it will ALWAYS be effective and compilable, no matter where and when it will be compiled. I simply hope so. What else can I do? Not much but moving farther away from what I really want: my own stuff, no dependencies, and creating new solutions for old or so far completely unknown problems (those "new" problems I mean are rather things that aren't really problems, but people not in the know believe they are). That said, I'm only an odd little, creative slice of the baguette we chew on all day when we power on our computers. *sigh*, this imagination in my head will never end. One day you think everything is so easy and simple, but then another thing pops up and you're confronted with the fact that everybody is stirring their own soup, taking a few croutons here but then smashing them back into another guy's omelet because he thinks it's tastier, and so on... And I'm one of those dudes, great. All the time I invested in writing good programs, and then I end up needing to know more than I originally wanted. Fuck it, totally.

Oh, goddamnit, why do I always have to help my sister cook when her Apple breaks down...


The more I think about more or less completely replacing the standard C library with my own set of stuff, the more I notice how awesome the stack is compared to the heap. Seriously, it's the best temporary dump ever! A few months ago I wouldn't have been able to realize that: using C++ and its new quite often, always allocating around, just to get a simple temporary array done. I don't know why, but I think it's simply because I was too much into that OOP stuff to realize how bad it actually is. Or did I realize it but didn't want it to be real? Well, I don't know exactly. However, I hadn't thought about more clever solutions to automatic memory management - and the stack IS such a clever solution. Simply go up and down, use a relative position, and we can do anything we want. There is no real difference to the heap except that it's managed automatically - so why bother with the heap all the time? Thinking about that, I start to see even more beauty in C's core concepts and how they translate to assembler code. So much stuff is done on the heap today, even for the smallest, highly temporary operations! What a waste, I tell you. We are NOT in Java, we are in the world of conceptually viable programming languages. Therefore, I can only distrust C++ more than before. Dynamic memory is still an operating system problem today - unless you do your own management system, of course. I started to redesign my own programming language to be more sensible about that matter and think it is best to get a bit away from my initial concept, while still having every command as a bytecode. I'll try to get a symbiosis between comfortable high-level BASIC-like programming simplicity and direct stack/heap relations like in C or assembler. Most programs out there don't even NEED any dynamic memory. But still, they use it all the time and don't think about what might happen if they dropped all the shit and went classic.
It's totally possible to do custom dynamic allocation: just allocate a bigger block of memory and manage it on your own. Of course, it's easier to just call new/malloc, because they do exactly the same, just on a greater scale. Memory management is WAR, totally. Like it is with CPU time and other resources. Therefore, the less you have to participate in such kinds of brawls, the luckier you'll come out! Simple. That said, I can now only feel joy about how simply I can design my functions and the overall interface. I like clean and efficient code, and I simply guess the prejudice found among OOP programmers that C is old shit and results in spaghetti is stupid. I, on the other side, know the differences and can clearly say that both have very, very different assumptions about what good program design is. Instead of saying anything bad about C, I'd rather call Java programmers spaghetti coders, because if something turns out wrong because their models are too fuzzy to be understandable, they don't get what it is and start adding more stupid frames around it. Similar to how Windows grew weirder every now and then, those guys tend to not get it.

That's the problem with OOP: you can teach them how to use it and how to design the spaghetti models, but you won't be able to teach them its underlying system with only that knowledge. If they try to enter lower levels, they'll stumble and rather hurt themselves instead. But well, we always need pawns to sacrifice and triumph.


Yay, I'm back online! And now I have to utilize this to make silly JavaME assignments, Qt component servers and absolutely useless research about digital crimes... Oh man, what a waste of bandwidth. Whatever. If everything turns out right, in a few days I'll get my very own line with a better, higher transfer rate and without those stupid network controls my father has kept since I made even one step into the worldwide web. What a fucking dick this dude is, living in the past since he once reached the present. Seriously, if you ever get a child, let it have its own internet business when it's old enough and don't just hinder it from getting its own connection without your control. I can't believe he actually managed to keep this situation up that long.

But hey, this means I can finally install all those damn updates on my laptop! I also began to completely remove KDE and replace it with XFCE instead. KDE felt good in the beginning, but a few modifications and more insight revealed a crappy settings system and horrible performance even for the simplest tasks. XFCE on the other side is fast, lightweight and does exactly and only the things I tell it to. Simply perfect, that's how I like it! I'll probably, step by step, replace my current Mint 9 setup with XFCE, too. However, I'm glad that all this is finally resolving and I can continue working normally...


Still offline

What the fuck, those telecommunication companies never get their asses moving. It has been three weeks or so since our phone line got damaged and they don't even care about it. I don't know how it is in other countries, but Germany is really terrible when it comes to that. The only reliable connection you can get is directly inside the cities (like Berlin), where everything gets a new layout once in a while. But living in the outer areas is either total hell 'cause nobody gives a shit about your problem, or the best thing ever, provided that you live far outside and know the only phone worker who comes regularly to fix it. *sigh* Do you know how annoying it is to install stupid Ruby fuck frameworks just to run a simple demonstration during a talk? Damnit, I'll never use anything but C and my own programming languages EVER again. Seriously, why do those fuckers create or even use it anyway. This ain't a pretty face you get after trying to cope with it. I'm sick of all this, and the loss of internet does nothing to support me. Due to missing entertainment WITHOUT the ability to play my Steam games, I've already watched through my DVD collection. Currently, there isn't much I enjoy apart from the stuff I can't do. Everything becomes grey if something stops working during its highlight phase. And I don't even know how to change it.

Offline repost #9

I can't believe how much it eases a developer's mind to NOT always worry about stuff being inlined everywhere. Seriously, that was an absolute illness of mine. Why, how could I actually believe that replacing the classic and persistently effective with some crude constructs would be better and healthier... Oh boy. But I still believe that inlining is awesome for bytecode languages, because you have the possibility to easily create and replace code during runtime, ultimately morphing itself into the perfect code necessary for the current situation! Yes, that's still the goal for me and my programming language: leading this concept to glory and success!

Offline repost #8: insight and responsiveness

Somehow, KDE seems more responsive than GNOME/Gtk. Simply FASTER. The fastest desktops I experienced in the past were Openbox and other ultra-minimal ones. But KDE really puts a crown on it with its full desktop functionality. I should've listened to my fellow students in the past. I haven't yet encountered someone using Linux and GNOME (except me, but that's over now!). Anyway, another thing came to my mind today while installing on the laptop and coding on the desktop: my paranoid inline use in C/C++. This didn't come on its own, I was thinking about how to design the bytecode for my programming language. A little brainstorming session resulted in a deeper understanding of stack use in C and assembler. I won't cover this in total, but it's incredible that every local variable lies on the stack by default if it doesn't go into a register. Since my language doesn't feature any registers under the hood, I will need to heavily use the stack for anything internal - even global variables. The point is that there is theoretically no global memory in my language. Everything is local, and variables in the upmost layer are visible to everyone. Easy! Plus they can also stay on the stack. However, I realized that my excessive use of inlining is rather... not necessary. Think about it: parameters get pushed onto the stack, and GCC is able to create specialized function instantiations for certain static parameter values. Of course this doesn't apply to already compiled files, but seriously - does it really improve performance soooo much that it's worth more than having a known, constant call time? Of course GCC can prevent inlining in case the code results get too big, analysing statistically to squeeze out a bit more performance. But that doesn't change the fact that performance will possibly bump back and forth when doing so. I had a lengthy discussion with myself about why I want to force everything inlined, making it even more uncomfortable to manage and compile projects.
And I came to the conclusion that it's better to not rely on inlining in this case. What will happen if I decide to use it on the NXT? Will it blow up the file size and slow things down because I didn't use it wisely? Well, using functions reduces this to a minimum. A bit less performance, yes, but the problems always lie in algorithmic and case-specific moments. Certain things take my time. This is one of them. And in the end I don't need to worry about anything and can let the optimizer do its work. I keep getting into situations where it's hard to reduce code size in certain functions. Good that I sorted that out.

So that's it, here we go! New workspace, new philosophy and no sweets in sight. Where is this paradise anyway...

Oh, and you should really check out Kate as your new favorite editor! It's a huge improvement over gedit or other silly Gtk editors. It may be a bit weird the first time using it as a previous GNOME user, but I can't help loving its features so far.

Offline repost #7: Finally something civilized in this dark age

I had a lengthy session of tests with different Linux systems for my laptop and think I will leave it as it is until I get the next wave of anger or so. It wasn't for personal software fetishism but rather for my student job - the tools they want to use can only be installed on Ubuntu 10.04... Quite annoying, because I was never able to completely install it. So far I had tested many things: installing it on the laptop, on the desktop, in a virtual machine on the laptop, in a virtual machine on the desktop... None of this worked and I had already damned the whole thing. At that time I totally forgot about my little collection of Linux distribution CDs... And out of a weird coincidence, there was also a Kubuntu 10.04 CD among them. I tried to install the required compiler tools using the live installation and it seemed to work - no error because of a wrong distribution name, and nothing else missing except a fakeroot package. So I put it on a USB stick to get better performance, though I needed more free space on it for the toolchain. You can guess what I did - I'm using it right now. So I can install and work with it. If everything goes right tomorrow, I can finally take a look at it and learn, learn, learn... Fuck, that took me long. But now everything should be fine and I'll hopefully stop installing stuff all the time... Fucking software.

Offline repost #6

Huzzah! I took the time to implement a rarely used function that's not strictly required for a FIFO stack implementation of mine. It's called "crosscopy" and moves a memory block from a_s to destination a_d and a block from b_s to b_d simultaneously, while a_s and b_d may overlap as well as b_s and a_d, but not a_s and b_s, and also not a_d and b_d. While this is a very special situation, it bears an algorithmic problem that's nowadays usually solved by allocating a new block of memory that's used to rearrange them without destroying the original memory. However, I decided to not use any mirroring or excessive XORing and to work solely in-place. I find this especially challenging and also really useful for places where memory is limited or you can only safely access a certain area without overwriting other stuff. After figuring out a working though not completely optimized algorithm (in terms of code size), I noticed how effectively this could work in combination with defragmenting hard drives. I think it's possible to rewrite the algorithm to feature n blocks of crosscopied memory... If this works, I can write my own fragmentation-based file system! Hooray! Or at least I can use this as a base for memory allocation using handles instead of direct addresses. For realtime allocation... there are definitely better variants out there. I can already imagine one... However, this stuff has quite motivated me today after this dilemma of always fucking up my system. Weird that this stuff also makes me happy after a bad day.

Offline repost #5: getting tired of this shit

Independent of what I'm doing - without internet, Linux is totally garbage. Distros suck at their default installations, it's barely possible to download complete packages including all their dependencies in one piece, and you can so fucking easily screw up the entire system just by doing your normal package work... AAHHHHHHHHHHHHHHH
I'm so fucking tired of those damn computer systems right now. And it's not the hardware, it's simply the software. Hardware isn't a problem unless you use something that you built on your own. Now, it's ALWAYS the software! In the past it was hardware AND software, and now the most important thing to me these days works less than before... Damnit, I hate this so much. If I'm ever going to live long enough to manage it... I'll code my own operating system. On a fixed platform (even if it's a fake OS inside a virtual machine) and with ALL my software experience and solutions for the existing problems I have with all these systems. Something completely, entirely custom. I don't give a fuck about compatibility, I'll simply strive for a completely custom OS with only some necessary drivers or so. And ya know what? I'll start coding it on the NXT! Yes, for fuck's sake! This will be the first test: a minimal OS on an extremely simple machine with only a few driver functionalities. I have my ideas for file systems, I have my ideas for multitasking, complete control by the user etc... I don't want all this atomic candy you get everywhere, I want a 100% working and stable system that's in every case 100% error-safe and can cope with all the shit that's possibly gonna happen to it. I have MANY ideas about how to achieve this, and the good thing is that most of it can be done dry inside another system, because it will, in the end, always be memory to modify and operate on. Even if it's not, I can still get over it. I can't say how much utter anger is bubbling inside my head. Either you take all that's given to you or you do not. Everything in between doesn't get you anything but shit.

Sometimes I think we need the times back where everything went inside a single machine and we didn't need to bother about it. Like with most old Apple computers (not counting the modular ones). Maybe I can find someone to help me build my own computer. That means own CPU, own hard drive, own RAM, etc. It doesn't have to be as tiny as nowadays, simply a few kbytes is enough. I want to have the feeling that there is something that doesn't depend on old computer designs, systems and software. I want something completely new, something that works on its own and with ultra-high stability in mind. A conceptually pure computer system with no flaws for a programming and playing user like me.

And even if I can't put this into a real, physical machine, I'll make a virtual machine out of it. A few hardware sketches, a simple I/O system, one allocated block of memory and done it is. But that's the sub-optimal variant and I'd rather prefer executing it on a real machine. And I don't think the Pandora will deliver a good platform for that, unless there are people helping to rewrite the drivers.

Meanwhile I'll need to think about whether I really wanna carry on with Linux. And if so, how I want to carry on. I surely don't want any desktop environment shit anymore. I had so many problems with their installation and removal... It's simply not worth doing, I think. I only want to browse, watch videos and develop on it. And sometimes a bit of Inkscaping, but that's all I need or want. So it's probably best to keep only this stuff in mind and let everything else run in a virtual box or so. I can't stand messing it up again... I stopped counting how many times I messed up a system because those annoying package managers did everything wrong. I simply want to let my laptop do only those few things - I know how to mount stuff and manage files via the command line, I also know how to configure and run my uni network... That's enough for basic stuff, and if I want to use a GUI, I can start Windows or my old Linux Mint edition on the desktop PC. Though I'd love to NOT have a Linux on it. I got so sick of all this stuff. But what are the alternatives? I can remember FreeBSD, but I don't think the necessary drivers are available for my ThinkPad. And I also don't know ANYTHING about it except that it's not a Linux but rather a Unix (as far as I can remember). Is there anything out there that one could see as a suitable alternative? With hardware support? And no such annoying things? I guess not... But before I try to tinker with this, I'll stick with Windows and my desktop at first. My current situation is a bit like noticing that there was a virus on your installation CD after already having installed it on a dozen computers. You'll have to redo it, but they are all not connected to any network. So it becomes a matter of act, not of time.

Anyway, I have time to think about every aspect of what I want. The great thing about C is that you don't need ANY of its libraries except stuff like stddef.h or so. You don't need to use malloc or stdio.h, you can write everything on your own - memory management, IO, string and character management, memory models for file systems, thread synchronization, etc... It can completely be done without the need for existing binary code. The only thing that sucks is that you can't work with size_t and void* without the previously mentioned stddef.h. I guess they were added later, but I don't understand why the fuck they put them into a separate file... Geez, sometimes I really don't understand earlier decisions in this language. Hm, that makes me think... I still haven't continued my programming language work (the one with the bytecode). The syntax is done, yes, but the commands aren't. And as syntax and commands melt together there, I find it hard to pin down the actual feature set, because I got completely caught up in uni work and other programming stuff lately. Though I know how it should look and work in the end, it's difficult to simply start with it. And I'm also not sure whether it's actually useful to not include special syntaxes to reduce the amount of code you'd need to write with the syntax I've chosen. Reading about Ruby's smart handling of sequences and C's ?: operator are just some examples of cool things I'd love to see consistently in a real language. Hm, maybe I should include a pre-processed command to translate special notations into bytecode. Hey, this could be done as bytecode insertion, too! Yes, I think it's time to finish some things in my toolkit and start working on it again. Simple steps, heavy hard drive use, several layers of functionality and code execution. I need a primitive, standard function base to execute the first code generation step, which will then make up higher-level functions to possibly generate even more things.
I don't think I need sooo many functions in my toolkit to do this. Just a few, very effective ones. That's the reason why swords are better than chainsaws: simple and effective vs. hard-to-make, bulky and resource-demanding. Plus you don't need a whole industry behind them.

Anyway, I've already done most of the necessary memory models in an imho very efficient design. I'm a bit curious: what if I could completely build my own operating system using this bytecode? I mean, the problem with operating systems is that you'll need to write many, many machine-dependent and error-prone things if you insist on hard-coded C. I won't use this bytecode on the NXT for sure, but I'll probably need to write a completely virtual computer for this, so there WILL be a bytecode I need to execute - no matter what happens. But if I completely rely on seeing everything as a soup of connected components with a simple but universal communication... Yeah, that should work nicely. Hm, this makes me think about combining my dec-jump instruction set with this language... Not with only those two instructions, but the basic assembler design I had for this. Then I could first write the assembler in basic bytecode and then write a compiler outputting real bytecode for the virtual machine. Yeah, that's an interesting idea. So the assembler should be close to the final, desired bytecode I had in mind. But since I designed my language in a very uniform way, it's pretty much the same format as I would design an assembler language in. Though completely NOT placed in reality and real-life computer design, it's still something that works the same, just that I make it more visible using my language's syntax.
And about the more symbolic, command-less syntax: that will be a special notation with a fixed set of input and output variables. I already have ideas for that and I won't limit it too much to premade things. Hm, I still have the whole world of bytecode generation open... Guess I will be able to think of something that generates code and not just function pointers! Oh, this could also make it possible for the thread manager itself to generate or execute stuff until address x... Wow. That's almost assembler code generating assembler code - a system based on self-change and mutation. Heyheyheyheyhey, isn't that the most awesome thing ever?

So it came close once again, didn't it? My anger about operating systems, about Linux, and bad software and user interface design in general - things have to change and I want to do my part. If I have enough time to keep this line... it will be FUCKING AWESOME. Simply think about it: a self-extending computer system able to create, alter, manage and optimize itself from a minimum of base code. Gaaahhh, what a mindfuck.

Offline repost #4: Evaluation

I became bored always pushing stack data back and forth (also, it's quite hot in here for such stuff) and somehow decided to tinker with the preprocessor again. The macros I once wrote for functional preprocessor programming came back to my mind and I restarted the whole thing. This time, I understand the possibilities a bit better and think that I found a practical way to introduce a bit of flow control-based programming. In theory, you'll need to put every flow break (if, loop, while, break etc...) into its own macro for the commands to be evaluated from top to bottom. The only way to get variables is to pass parameters. Therefore, every line requiring variables or parameters will also need macro parameters - one for each variable. To call this function-like block, just call the first macro/line (which should also get named properly). Recursion or even calls nested inside other calls will still not work without duplication. Whatever. I think it's ok for basic functionality you can use to generate code. Somehow unbelievable that it's possible to program numerical definitions using the preprocessor. To create a complex language, it requires a more primitive one - a basic rule seen everywhere in computer science.

Offline repost #3: The wondrous wonders of macros

Macros are very useful tools to create something completely abstract and out-of-context: they don't require types, they don't require prototypes. They just are, and work in their own ruleset. Before I started using C that much, I believed that macros are completely evil in any possible circumstance. That's wrong. That's entirely, totally wrong. There's nothing evil except human failure and desire. The line that macros are evil is mostly used in combination with style guides, good practice write-ups for C++ and so on. It is true that C++ doesn't need macros as much as generic C does. But it's not that they screw up everything. There are situations where macros are very, very useful - range collision, series of logical decisions to determine certain relationships between already calculated values and so on. Macros shouldn't be confused with functions and never written to do algorithmic behaviour, unless you're planning a more complex system of code generation that should be cleanly separated from normal code. I think that's also the reason why some people spread the word "evil" around the coder globe. Most C++ programmers don't know where they are useful, because they don't program in C and have other problems to cope with. In general it's something bound to C code rather than to any other language. One can do simple and very efficient things with macros in C as well as with inline templates in C++.

It's sad to see how rarely it's explained what macros are useful for - that they are not evil but merely lost their purpose in contemporary C++ programming. Maybe that's because C++ programmers don't have enough C experience to round it up, or just because they like to plant nightmares in newbie programmers' heads. Let's say macros are a powerful but rarely used element in C and a compatibility artifact in C++. They have their reasons to exist, just as inline templates have. It's like always: C offers you awesome freedom while not taking responsibility for the results YOU programmed. It's a liberal language with some simple guidelines and a lot of syntactic sugar. Nowadays it requires more professionalism than other languages, and a bit of perverted taste. Where others dictate HOW your application is built up, C only forces you to keep a basic paradigm (and even that can be avoided if you need to). No classes, no framework entry points, no events, only simple functions. But almost everyone forgets how - theoretically - few facts one needs to know about the C language itself to achieve almost anything other apps do nowadays. It's not a simple language, but it's very, very small. BASIC languages on the other side demand that you browse through many, many API listings. C does have a standard library, yes, but you can also program in C without it. It's just not necessary, because it's itself written in C or sometimes assembler. Java on the other side is imho quite strict and has more rules to force you into coding in a specific layout. Is that easy for you? Well, if you want to write the way Java tells you to - it is. If not, it becomes a cruel and uselessly complex reign over programmers.

Oh, I'm tired of being told how and what to program. I see C as a free mind in that soup of programming languages and paradigms. It's the organic food of informatics, no... but it's surely "the craft of an old age". Though not necessarily a more civilised one...

Offline repost #2: no more inline worries

Hehe, I pimped my inlining possibilities! If I now want to inline one of my separated files, I can simply define ITK_ENABLE_INLINE before including the header, and all included functions supporting inline will be inlined. If you don't inline or need it separately, you can still compile it standalone and it will work the same. Nice, eh? I really like this. Combined with a configuration file to set up the necessary macros, I only need 4 additional lines for each header file and no additional file for the .c file. Yep, that's it. Simple stuff, but awesome if you need it. And it will work for everything that's inlinable... awesome! So there we are: configurability. This system works not only for C but also for C++, as it is based on the same principle. Too bad I can't make the preprocessor more versatile without breaking the C standard! It would've been really nice to automate this for more, different macros without redefining it all the time. However, this is better than syrup on pancakes. I just love listening to good music and coding something that's simply awesome although not actually used anywhere. Every other student I know prefers coding applications rather than "technology". But in the end I always do the stuff they don't want to care about. And then we get shit like Java and all that stuff around it. Yap, yap, yap... that's the sound of INVULNERABILITY. Oh, I'm sorry - wrong script!

Offline repost #1

Usually, customers come back if the product or service was good - that's true for merchants, hardware manufacturers and software. Quite dependent on price in the first case, it doesn't matter in the second. After not being able to get Arch ready for production without internet or wireless network, and having only incredibly horrible Ubuntu variants lying 'round, I decided to simply take what I found fascinating and practical instead of trying to think about WHAT I'd need in a distribution. So, which were the distributions I had most of the time, the ones that served well in most vital situations? Linux Mint and Debian 5.0 Stable. Mint uses Ubuntu as a base, but more de-sucked than the official derivatives. Debian 5.0 was... old. VERY old in terms of updates and available software. I knew that Linux Mint offers a variety of different flavours including XFCE and KDE. But all these were still Ubuntu... that operating system which never worked with my study place's wireless network. But Debian worked with ndiswrapper - quite well, I have to say! But well - age pays off in different ways, and I always believed that more recent stuff like Debian Testing was as unstable as the CrunchBang I used for a small while. But in reality, many people used and still use it as their stable main system, because bugs can be removed quicker than with frozen releases.

So... Linux Mint has a rolling-release Debian version. I always wanted to test it, but didn't want to reinstall my system while it worked. However, I had almost gotten USED to reinstalling my system every couple of days (this isn't funny anymore), so why the fuck not combine those two experiences? Debian itself is fine and compact - Linux Mint brings all the desktop media stuff you'd usually install later. So I dumped it on my harddrive, and it was the first time in weeks that I felt comfortable with what I saw. I don't have to tinker with Arch's lo-fi minimalism, don't need to get angry about Ubuntu's annoying system screwups, and can profit from the huge collection of binary software. This time I installed an x64 system - never did that before, but it might result in better overall performance.

And it shares the same rolling-release nature with Arch. Updates come on their own, nothing needs to be reinstalled, you get the newest versions, etc, etc... I'll leave it as it is now. I've known Debian long enough, as well as its default Gnome 2.30. But I don't want to keep it - I had enough trouble with its more tedious parts. And what luck that I have the whole range of desktop environments available, right in front of me in Synaptic's package listing...



I'm currently learning Ruby for a talk I'm doing with some student mates in several weeks. The talk itself is about Ruby on Rails (here we go again, my beloved "web development"...) but I'll do the technical part - learning the language, coding an example webpage etc... Not sure what the English book title is, but what I'm currently reading is a nice and direct introduction to Ruby's syntax and functionality. I can remember myself sitting in front of an old CRT monitor, barely able to even write a C main function, trying to cope with Ruby and its horrible SciTE editor. I didn't know anything about it, but they told me it's easy to learn and so on. In truth, I didn't understand a thing. Fresh from RPG Maker and used to simple BASIC-like commands, you just couldn't tell me what a variable is and a class and a loop and whatever else... I didn't understand it and dropped it. Now, years later, after Ruby became some kind of "new web tech blabla", I understand everything that's written in this book. The problem with scripting and high-level languages is that they use concepts a completely fresh programmer won't know of - even if the book tries to teach them. However, it has many really good ideas: negative indices for reverse index access, additional operators like <=> for giving back a value between -1 and +1 depending on the comparison, multiple variables in one assignment statement etc... Tiny things, but these bring a bit of... trust. Yes, trust! Trust and hope that it has more of those nice, tiny things, and also the naturally builtin hash support. I'm quite a syntax fetishist and know why I learned to like C more than BASIC. Though Ruby supports some, imho, a bit too varied possibilities of calling functions and grouping code, I'd currently describe it as "awesome garbage".
"Awesome" in the sense that there are actually useful features that don't fall into the Java areas of syntax, and "garbage" because it's not very useful apart from scripting and a bit doubtful in certain choices of syntactical freedom (meaning there are some hard-to-read but possible constructs aka "print unless evenxy" and such - weeeeird!). However, it IS a scripting language and doesn't even try to pseudo-compile like Java and the like. That makes it better by default - better to stay close to the spirit of your offspring medium.

So maybe I'll keep it in mind as a quick "need to test this" or "that screams for a script" language I could also use later. Yeah, I'm looking forward to it. There is a reason why I read books like that. It didn't work out with LISP (besides the fact that there are millions of completely different implementations), but maybe it works with Ruby.

C Advantages over C++

It's interesting how many actual advantages C has over C++. Of course, both are equal when rated in accordance with their respective domains, but C is in general more "upwards" compatible than C++ because it's one level lower. My "beloved" (I'd textually vomit here if I only could) lecturer can't deny that a completely inlined C++ wrapper for an OOP-ish C API is as efficient as its native C++ counterpart. Thiscall aside, most work of a program isn't done through the this pointer but rather on an algorithmic level. Compilers can surely optimise on their own to detect any widely used symbols. Imagine a function that's only used to get some parameters from an object and works on completely unrelated things later. No this pointer necessary; its register could be used more efficiently for internal work, or even be completely rearranged when inlining. Even if one DOES actually need a highly responsive this/self parameter, there's still the possibility to flag it as a register...

Whatever, using C gives me the ability to code in C++ too, using a simple wrapper. I like the idea of feeding both ways to program, because not everyone is used to classic procedural programming. All in all, good designs in C and C++ are most of the time equally efficient. So if a lecturer or student wants my codebase to also be OOP - well, wrapping is not much work for me... Unbelievable that there's such a huge difference between both approaches. I mean, I guess I wouldn't have turned into a C-only programmer if OOP weren't so stupid in C++ or other OOP languages. I have the feeling that designers of OOP languages are either people who learned OOP as their first paradigm or just some lame procedural programmers who couldn't bear the shame of writing spaghetti code. Or just some guys who want to force others to use their designs. Yep, that could be. Whatever.

I guess my struggles with OOP implementations and designs can be read in large quantity in this blog... We all should take a deep breath and try to "code in harmony" instead of bashing each other. Or maybe that war only exists in my and my lecturer's heads... Some students I know even feel a bit "useless" when it comes to those non-highlevel details and programming. The whole holy war of OOP might be a problem limited to a few expression-heavy people like me or my lecturers! So if this bitch didn't exist, I would of course not choose the hardliner path. She loves forcing all people around her to use OOP. I'm currently using the "backdoor" and coding my project with another group doing Qt and C++ instead of Java and JSP. Seriously, nobody needs those millions of web frameworks. I don't say we don't need them in general, but there are more frameworks and web programmers than anything else a level below. Somehow stupid. The higher abstraction goes and the lower do-it-yourself goes, the more weirdos and lunatics come in, claiming the throne of information technology for themselves. The fact that there's even hardware able to execute Java bytecode in realtime makes it even more scary...

One day, I tell you... one day I'll put out a complete alternative to OOP programming that's more efficient, more minimal, more generic and way more flexible than any OOP incarnation - all at once! However, this will take some time, and it's good to know that I'll probably live at least 20 more years. After that... yeah, I should've achieved something by that time. But since I'm currently writing all the stuff in C I couldn't write before in C++, it has already started. And then - one day - my language will be ready and loaded for maximum carnage.


I thought about the stuff in my previous post about marking the this pointer as a register. It might go totally wrong in some situations and not in others. I still hold the belief that it's not necessary to think about that if the function will be inlined. If it's so important and performance-critical to have this in a register, you can also inline it. Pushing local variables and parameters onto the stack can be far more demanding in terms of instruction count than simply inlining it. On the other side, that's probably simply garbage! CDecl is stack-based and only reserves the eax register for the return value. Therefore, I can't think of any way to explicitly say that this goes into a register. Or well, it probably loads them from the stack into registers, hm... Yeah, that's probably the case. Too bad I'm currently hacking on Windows, so there's no way I can test its output with no effort. However, I'm thinking about introducing a #define-based pseudo-thiscall system that should also work for inlining stuff and so on. I know that the crux of inlining is too important to ignore completely. So, depending on the included files, the library user should decide whether to include the inline or the classic function-call files. It would be quite simple in theory: define a prefix for all functions, put the funcs into a header file and include it in a .c file. Depending on the prefix, they can either get inlined via the header or be normally compiled into the C file. Need to test this out one day. However, I'll stick with the inlined header funcs at first. It's just nicer to code with them.

Werkstudent, bitches!

Yay, my lecturer finally gave the green light for term-break work on Embedded Linux programming in C. That means I'll a) get money for it, b) can program C on a commercial level, c) get to know more experienced C programmers and d) learn from them, as well as e) have to set up the toolchain in a virtual box. Yeah, that's right, I need to install it in a virtual box because the toolkit only supports Ubuntu 10.04 LTS as a development platform. That's OK, but at the moment my internet connection at home is broken (I'm writing this offline and during my lectures), and before that occurred I decided to give Arch another try (Xubuntu has so many disadvantages in many, many ways... unbelievable!). So at the moment I have a minimal terminal installation with nothing but the core Linux tools and interfaces. It'd be OK if I could only get wireless running. It's currently the only way I can get internet and fuck, it sucks...

If all this stuff works, I can set up a little environment and play around with the SDK. They use "Code Sourcery", hm... never really heard about it. I'm a bit pissed today because of that. I would love to say that I already got the stuff to run properly. The website mentioned something about emulators, hmm... I really don't know what all of it actually is, because I currently don't have the resources to install or even study it, but whatever...


That's total shit today. Never play too much with your Xorg server; you'll surely regret it.


Sometimes, when everything seems ready, when your functions fill the pages of your editor, it's lovely to see how many things hide behind them and that they don't suck by your own definition. I'm satisfied, I'm totally happy. Nothing beats a hot summer day with refreshing rain in the evening and a beautiful, fulfilling function write-up. Throw away your sorrows, watch a wholehearted movie and let the breeze of trust and satisfaction wind up your hair.