Originally, I wanted to write a flexible LIFO and FIFO stack system which works a little like the stack pointer in assembler: simply adding and subtracting stuff from the address depending on the pushed/popped data size. Well, I noticed how useful and size-reducing a memcpy that copies backward instead of forward would be. So I thought it shouldn't be hard and started to code... Writing memory copy functions is NOT trivial! What do you do if source and destination overlap? Do you want to copy byte-wise or int-wise with a byte-wise rest? Is your copy function so big and complex that you can't inline it? Questions you might not think of in the beginning.

At first I wanted, like always, a complete comfort package catching every overlap, always copying int-wise to minimize the necessary instructions, being ultra-small when inlined and so on. Well, that's a total myth. I'll probably write several functions that do things differently. All in all I think it's totally useless to write special functions for forward and backward copying cause it's simply not necessary. Whatever you try to duplicate this way, you'll always have a start and an end or a size. And if you don't have one of these three, you can calculate it and don't have to implement a shitload of funcs just to have them in case you'll need them one time...

That said, I stumbled into something I can, again, not simply leave alone. But it's interesting how close this situation comes to copy functions in general, which I always left out when writing array operations in C++. On a level where you don't need to worry about overwriting stupid class IDs, it's way easier to write copy functions. Did I mention how I hate C++'s hidden RTTI system? It plainly sucks. I never really used it and I doubt I ever really will... However - back to memcpy - it's surprising how limited the possibilities for copying large blocks are and how much more limiting the possible combinations appear.
Theoretically, you can't just copy int-wise without requiring non-overlapping memory areas. The reason is that memory is addressed byte-wise: copy one int into its own body and it's only possible to operate swap-less if one takes the minimal representable unit of the used address system and copies it from end to start instead of start to end. You'll never know HOW the exact copying is done inside the processor. I think there's potential room for processor improvement if you ask me... However, this means that I'm only able to int-copy non-overlapping areas - not too bad a decision cause I only need to implement two rather simple functions... I'd love to freely combine algorithmic properties during compile-time. This way I could use smaller subsets of copy functions and combine them on demand. I know that GCC is free to reject any inlining wherever it occurs, mostly because it increases code size or register use (at least this was my experience). So whatever construct you create, there's the possibility that it won't end up situation-optimized. In theory it's a clever idea to compare inlining and function calls and decide depending on that. But in practice there are so many lowlevel functions that would benefit from situation-dependent optimization. It's barely possible to create a complex function with many optimizations for many cases and situations if it's still called instead of inlined cause it's too big. There's almost ALWAYS an "almost perfect" way for every situation in every code that doesn't depend on runtime data. I don't want to judge the developers of GCC or C cause they didn't create it to work as a sole optimizer but rather as a many-platforms-supporting compiler for many standardized languages. I'd be a fool to judge the immense amount of good work they did; I'd better create my own compiler that compiles to C code instead of "blaming" it. Did you know that C was written to give an abstraction of assembler code?
That said, it's the only actually portable way to export your programs. I like C cause it's exactly that: following its spirit in almost every way. I have my complaints, yes, but these are a matter of facts and design history - not false philosophies.



Oh, I rediscovered my old music folder (most tracks can also be found here under music). Really, I didn't know how much charm and style lies in some of them! Especially those DS-10 ones are simply awesome to hear in the background. Need to put some new tracks on it, this always pays off...


Well, comfort package and all that

For ease of installation and maintenance, I kind of decided to go back in the Ubuntu direction of Linux. Not the original Ubuntu or Linux Mint, rather the XFCE-based distribution Xubuntu. It's a bit of a backup and a concession to my inability to NOT always play with system settings and delete important desktop environment elements. A system like Linux is simple if you know about all the places and settings, but totally different when it comes to the combination of OTHER software that hooks into the chain of execution. I never understood why the fuck my window and display managers mess up all the time. The whole xorg thing requires further reading to change it as often as I do. Well, seems that I'm just not as crack as I thought... Gotta say it was more some kind of prestige thing, cause one can also do all the stuff using another distribution. However, it frees me from using GNOME all the time! I'm tired of it; every time I try to uninstall its components, the whole system gets destroyed or at least fucks up xorg. I wasn't able to get it back to normal last night. All in all I think it's easier for me and my reinstallation obsession to simply use one distribution (and no Windows alongside) and use the whole harddrive for something that proves suitable over time. I only have a few files I really need on a system (mostly source code and distribution-independent binaries), so it doesn't hurt to test new things more openly. One idea to test things in a more stable environment would be to use a virtual machine before installing them.

At least for the student job I hope to get (well, it seems that it won't be that much of a money-based job in the beginning...), I'll need a fresh Ubuntu 10.04 installation to execute the SDK tools. If I think about it, he never actually mentioned anything about getting paid for it... I must be such a fool to do this. However, I could get a real student job there if I prove "worthy enough" (personal interpretation!). It's quite a prolific company, a small team of 9 people to work with. I hope so much that this is going to be something I can make a living from. It's always good to keep connections supplied, there might grow a plant out of it...


I killed my window manager/desktop environment setup on Arch Linux with a bit too aggressive uninstalling. That's what always happens when I switch distributions... Well, I reinstalled it but messed up GRUB (though I could use the old installer...) and will probably have to install it again if I want to continue using it. Fuck, I shouldn't always desire a change in the desktop environment I use. I think the fact that I only need a handful of apps and can develop and work using the command line changed my point of view too much. Isn't there some point when I'll stop with it? Probably not, cause I love changing all things dramatically from time to time. It's the counterpart to why there are dudes out there always repainting their rooms, buying new clothes etc... It's the wish to change everything existent and simply go wild.

Yeah, I'll most probably NEVER stop doing this. But change needs to happen somewhere. Either now or later.



I hadn't noticed how utterly spoiled I became over time when it comes to designing actual program functions... I already knew that I have problems designing bigger projects cause I most often start to think in terms of the technical implementation and not the functionality. That's probably also the reason why I never finished or even started anything. However, technical designs are most of the time really easy to create and, over time, also quite easy to optimize if you're into the subject (not talking about the endless quest of perfecting something). Today I created a little design document about a REALLY minimal instruction set using decrement, jumping, some program counter commands, input/output for external devices and a custom assembler notation. I tried to begin a parser and executer for it, but... I couldn't get it done. I already wrote a shitload of parsers and didn't want to repeat it all over again... So after a while of "getting into it", I faced the necessity of something actually useful, something that's helpful when developing applications! A compiler IS an application, though not in the contemporary sense. And one by one I started to write header files of stuff I'd need for comfortably writing all the things I want.

I never started writing those kinds of interfaces before, I always jumped straight into implementation... What a mistake! You get much better ideas and designs by planning this way. Implementing it is a no-brainer in most cases...


I had a great day today when I shot a few things with my Lego gun. I always tried to make the cartridge lock with a rotating hook - ok for low power, but it fails at higher strengths. The only way to solve this is to introduce a completely horizontal/vertical lock without rotation. I always disliked the idea cause I didn't really know how to trigger the cartridge without removing it manually or even quickly. But one way is to use a lock pin that can be "kicked" from the side or top/bottom, resulting in a possible quick shot release. The theoretical model for hammering it would be a rimfire-like hammer or a "fire pin" arranged to stay at 90° towards the gun body. I'm sure the latter variant is heavy and unreliable, so I'll try the hammer variant. Yes, I think that's a good solution to this problem. Yay, finally something different that could make bullets better and stronger! I longed for that idea a few weeks ago... I need to redesign the planned gun model and mix it with this new variant. I could create stunningly far-reaching Lego guns I think... holy crap!



I wonder if I could somehow leave all the OSEK stuff behind and create my own system by implementing a virtual machine or something like that... I read a bunch of things about the ways of uploading and permanently storing data on the NXT and I think it's quite easy to do so. Getting the leJOS source code will probably help me to understand what it does, and where and how all its functionality comes into existence. I often find myself in situations where I'm too lazy to read more about how existing technology works, so instead I try to get my hands dirty and learn about it that way. Well, it's one of the reasons why I never actually liked to create normal everyday applications, but whatever...

Reading and learning

Geez, it's way more comfortable to have the full system power on a huge screen while reading APIs and playing delay-heavy games (I recommend automated baalruns in Diablo II). nxtOSEK is quite an interesting system, not to say a framework, in that you need to create tasks, implement pre-defined system hooks etc. Its ECRobot API is also just a wrapper for leJOS driver functions, so it's designed to harmonize with OSEK and hooks in when the OSEK system starts. It sets up device hooks and possible interrupts, so that everything can work on its own and you only need to provide the logic. Really smart on paper, I wonder whether it's also nice to program. Whatever - I still have to hack my way around the sound API. It's great that all functions are provided as sources for compiling, so you can simply include and use them from there. Will require some deeper studies, hm... I hope all of this will turn out well. I don't want to see myself being depressed just because I wasn't able to use leJOS' lowlevel API.

Oh, and choosing custom project locations really is way better under Linux. I'm currently trying to find out how to set it up on Windows, maybe I'll find a better way instead of dumping everything into the nxtOSEK directory. I bet it's easier if I put it under my home directory cause it shouldn't matter where the build process is executed from. Or I just did something wrong in the path configs, hm...

gedit on Windows

Holy cows, someone ported gedit to Windows. That's awesome cause I can now simultaneously develop on Windows with the same things I already use, including gcc/make via cygwin and a proper nxtOSEK installation. More than once I wanted to also develop stuff using Windows cause a small laptop isn't always suitable for everything. I also have Linux Mint installed on my desktop but that's not as convenient as I'm not able to also use Steam in the background to download stuff... Additionally, not every proprietary thing is available under Linux and some day I also want to develop a VST plugin, which isn't possible on Linux... However, it's good to know that I can also develop and upload nxtOSEK apps. Interestingly, it's a bit smoother in cygwin cause you can use the given shell scripts for automatic upload... Of course not as flexible and direct as under Linux, but quite a plus I have to say.


Though the day was horrible and I ran completely out of energy, I think I'm done with my new and revised set of generic linked list functions. I have nodes with n directed connections, singly-linked lists and doubly-linked lists. As far as possible, I made them static inline inside a header file, so that they effectively function like inline templating for every newly parsed file. The interface is a bit uncommon and not made to be highly comfortable, but rather generic and flexible in application. However, it's quite an interesting thing you can also utilize for specialized lists, if you prefer having a comfortable and fixed interface. It doesn't waste any pointers, comes in at least three flavours and seems to work quite well... The more I design those simple procedural interfaces, the less I have to care about overloaded operators, possible other functionalities etc... It feels more like a "whatever comes to your mind: just do it" way of creating programs. Well, I was always rather fixed in how I design my programs, so this goes together naturally.

Naah, just fuck it and be done. I don't even have a list of things I still have to do for my NXT project. I thought about the thread layout and raw implementation features but not about the actual tools I need for that. I know I need reliable and flexible data structures like lists and trees, which I've already done. Then there's the geometry stuff left, which I'm kind of afraid to tackle. I think it's best to find something else instead. If you have all that stuff and some other tool functions, there's everything you need to begin your custom project... Why am I always afraid of starting something actually working? Why do I always tinker with project-less function sets and class collections? WHY THE FUCK IS IT LIKE THAT? Damnit, nobody knows. Maybe I'm simply stuck somewhere at this point. All in all I always praise how great this and that is, but I never start doing something that's... visible or can be used by someone. I hate that. I simply can't motivate myself to do something like that; maybe I'm just afraid of the stuff that's related to all the work connected with actual software but not the class libraries behind it. That's... embarrassing. Really. A programmer not being able to create a typical piece of software on his own because he is afraid of doing trivial work instead of mindfuck pointer arithmetics. Yeah, that must be the root of all evil. Though I can remember that I already created software back in my BASIC days. Then, when I tried to create games again, I got lost in those evil cycles of hiding... Hmpf. I need to break this. Time to create tools for actual software development and not to perfect some rather useless ones. And as you can see: C is helping me on the way to simpler and quicker-to-program software. Uhm, at least I guess so. But it's a fact that I was faster coding and testing it than I was with C++... (mostly cause I can't do so many rather useless things in C).

Inlining, third

Yep, I was right. The only way to "inline on demand" is to create a header file with static inline functions and include it in the file where you use them. Well, it's not strictly the only way, cause defining static functions with the inline keyword inside the same file will also work. The more you put in one file, the more GCC is able to optimize out. I don't know what dimensions this could reach in bigger projects with back-and-forth referenced files here and there, but I believe it's as "cruel" for the compiler as instantiating template classes or so. The nature of a single-pass compiler can rarely be overcome with simple measures but can sometimes reveal some interesting simplicities in compiler design. However, I'll stick with the static inline approach for memory and numeric functions. It seems to work nicely within its bounds and I'll later be able to debug more easily, with all those possibilities to still get the jump stack with non-optimized compilation. Yeah, that's it. I found a way more comfortable way of creating code than with C++ templates! Plus, I finally realized how useless it is to tinker with such stuff in C++ cause it's easier to realize in C. Another plus point: I can export every function and make libraries out of them! Good solution, really good solution. In the end there's always an optimizer made to optimize away situations no everyday programmer would recognize without diving deep into platform-specific attributes...

Yes, optimizer, YOU WON. For now...

Inlining, second

It's interesting how extremely differently C and C++ handle their inline specifiers. In C++, if I remember correctly, I could inline everything and it got placed everywhere as I told it to. In C you can only really inline inside the file where you defined the inlined functions. You can try whatever you want, inlining only works well when doing it absolutely locally. The GCC manual mentions that inline definitions should be findable inside the same compilation unit where they are used (I guess that means the same file after all include operations occurred). You can't simply template-ify a million things you think are good for the program. In fact, your only option is to rely on GCC's optimization technology being smart enough to detect what would be better (aka function calls vs. inline execution). After a while of playing with the assembler output, I noticed that almost all my functions ended up being inlined! Well, should I really worry about this stuff anymore? However, the two which GCC didn't inline were an allocation function (understandable, that's a hefty and not really lightweight process) and (strangely) a simple function for inserting a list element. I don't know exactly why, but it told me that it'd be too big in code size and that it therefore didn't inline it. Well, that's ok I think? I don't know THAT much about how GCC detects this, but with the more direct approach of using functions to link list elements, it stopped inlining those. It probably has something to do with what translation unit this stuff was built in. Inside the insertion function I use the link functions too, so it probably inlined them cause they were in the same file. Can I somehow utilize that and mimic it all the time? Not sure... will probably test around with it a bit. However, I currently don't have a problem with that as it means I can simply put all critical things in one huge set of functions and here we go again. Let's see what more GCC has to say about it.

A declarative challenge

I noticed a quite stupid use of double malloc in my list node code which isn't actually effective in several ways. I used a dynamically allocated array for storing the links inside a struct, which should normally be static. So if you malloc the entry itself and the links separately, you do two allocations where a single one could do both things. The problem is that you'd need to self-reference, cause a link links to a collection of links. Without introducing an identifier that encapsulates the array (and thus a newly begun pointer type), it won't work. One solution I had in mind was the most direct and simple one: just malloc one block and access its members. But for that you'd need to pointer-cast back and forth... Not optimal and far away from declarative. However, it seems the only possibility and may work efficiently when tested and used carefully.

Oh, how I love the "dangers" of this bare metal programming! It's the point where your inner nerd can show how a clean mind can stay above the course of all restriction-induced matters.


Mind fuck

I just rewrote my node/linked list entry classes in C syntax and it became way cleaner to read, let me realize that certain commands weren't necessary and showed me how to simplify the overall API. You see, C is cool! C++ is out! Plus the fact that it's easier to structure in my head. I learned to think like a computer - I don't need all those fancy object models! OOP sucks! However, good simple C code (not to be confused with good complex C code) is either completely value-oriented, plainly memory-based or structure-based with a set of functions for this structure. So the latter way is simply like EVERY normal object-oriented layout with methods and data members. Virtual tables are simply function addresses and not more. I can't believe how there are still people believing and praising how simple object-oriented programming is. It's a matter of what you told beginner programmers is a good way to structure your program. Or let's correct this by saying that it depends on how they prefer to design their programs. I learned using QBasic and RPGMaker first - procedural and event-based programming (even if the RPGMaker 2000 was quite limited). I also took a look at object-oriented programming languages, but at the age of 12 I wasn't really able to understand the concept of object orientation! So where is this "easier programming" now? For a total beginner, it's always easier to grasp that a computer needs to be instructed by commands to do something! This is how reality works and how you instruct people to do stuff. Alpha and Omega, you know... whatever. And therefore, if you don't try to be a lazy bum who doesn't want to learn how something works (you'll ALWAYS need instructions for your program), you'll always find the easiest and most comfortable way to program by knowing (and telling) in what order stuff has to be done and how. It's a matter of facts and computer basics.

Interestingly, even those Java programmers who don't like non-object-oriented languages create code that could just as simply be done in C or so with exactly the same amount of tech behind it. The difference? Well, you need more code and restrictions for your classes - one way the language tries to force your programming style to get it right. If you can't do this on your own, I think it's totally fine. Otherwise... good question. Do you believe in what you're used to?

Download your personality here

I saw the first half of an interesting documentary about the information one can find in social networks, forums, Facebook etc and guess it's a bit more difficult to find my content using my real name and maybe a nickname. The sole nickname would of course bring one to my sites, but not to my blog as I've disabled it from being indexed by Google. Well, maybe other search engines would find me. Who knows! However, I tend to only spread my philosophical and nerd-related activities like programming rambles, Lego gadgets and so on. Well, I wasn't really able to connect my real name with any of my internet presences and really didn't find anything about it. This brings me back to the nice fact that this blog is read by nobody but findable by the guys knowing, for example, my deviantART page. I didn't even mention my name here, nor do I believe that anyone will ever find this blog using Google without further information about its address or even name. Though... I shouldn't take this one too seriously... Guess writing completely senseless and disconnected things with some more or less valuable information here and there does bring more than only a thin layer of translucence.

All in all, I'm glad that this blog can always be kept as an almost completely real-name-disconnected representation of stuff nobody should know, in the sense of being absolutely NOT useful unless you're after philosophically enhanced nerd toyings.

2 out of 6

Today I had a little talk with the lecturer who wanted to get me a student job, which was quite clear and revealing about what he had in mind to do. All in all, he's an associate of a company developing video streaming, monitoring and image analysis (he does his part as well). For example: capturing videos in a bus, analysing how many people are sitting in this bus and messaging a control center to send another bus cause the previous one became full. They're settling on a new target platform (some Texas Instruments all-in-one chip) and would need to make benchmarks, move their tools, test how suitable it is etc. I don't have much realtime experience with systems other than a standard desktop PC and the NXT, but I tend to read and think more than I currently need to, so I may get the possibility to utilize my adaptation skills in combination with a (seemingly) complete documentation about the chip, its preinstalled system, etc. And since they use Linux and thus a lot of Open Source software as a base, I don't have to get comfortable with "proprietary scum" and so on. However, depending on how much and when I have time, it will either be a "home office" kind of thing for testing the tool chains, creating software for it etc, or a real job (which would be extremely cool) during term breaks or one complete day a week. However, if it goes towards getting hands dirty on the actual hardware, I'll probably get a place in their offices and work together with them. Also, he's always all ears when a student says he prefers to program in C, due to general shortcomings in his area.

So, hooray! I'll probably take the time during term break (there's only one left til the last semester) cause I tend to put as much effort as I can into assignments, student projects etc. That way I can fight my general boredom during non-uni periods and follow my work and personal projects during normal student time. Yay! This sounds veeery promising. Finally I get into interesting areas of those normally "hidden computing" projects. I prefer not creating those desktop apps we have in normal user environments. I'm just not such a WYSIWYG/user interaction type of developer. I like it direct!

However, this motivates me to stop using all those shiny C++ thingies and try to dive more into C programming on a toolkit/library-like level. I hope this speeds up development a bit and makes it less of a psychological torture for me to work with object-oriented concepts (actually, I never really worked with them, but that's another question...). I'm interested in developing some alternatives to C++ templates in the way I noted in my previous blog entry. I bet I can get it done simpler and way more modular than with C++ or so... Yeah, that's something to work for. At least I do have something to keep my mind busy til term break, when I want to start actually working with the company's team or even only with the lecturer.



Yep, it works. Quite efficiently. Didn't know about this one before. Hm, I think this can be used much more efficiently for more interesting things! It really must be the way C++ arranges its members, by simply nesting the structures... Yeah, sounds reasonable for a compiler to do. Guess this can be abused to force some kind of method-less object orientation in C! Could it be that those OOP systems for C work the same? I bet so. The roots must come from somewhere, so it's no surprise this way sounds sooo convincing this time.

So yeah, hooray for data structures in C! I think it's time now to port this work to C, if not even continue the work there. At least for such storage types. For n-dimensional vectors... well, I haven't yet found a good solution. But I bet I can use the macro collections I used to teach the preprocessor adding, subtracting and multiplying. All in all I just need to be creative - this way I can do all kinds of things with it... Sure, it's not as comfortable as with C++'s constructors and so on. But it's surely an effective reduction cause I don't use any C++ extension except a bit of syntactic sugar here and there (templates, references and some other things - just sayin'). So I'd rather like to see my compiler completely written in C instead of those painful C++ template classes I wrote. You have to use sooo many long lines for all that stuff... It's also not as clean to look at. Yeah, leave C++ as what it is and try to hack your way through the typical standard. I won't start with those "C++ isn't installed everywhere" arguments; that isn't quite true. Almost everyone does have it, but it's probably not supported everywhere! As I hope to get more into the lowlevel areas of programming later, there's definitely a bunch of platforms with no or only very limited C++ available. So let's turn the lights off and get dirty.

Stripped down linked list in C

I found a special solution for how to implement linked lists in C without sacrificing the format freedom of C++ templates. I've only ever seen generic linked lists as a combination of the link information (all connected other items in an array or a more primitive list) and the data, so it's no wonder that I hadn't seen that linked lists are defined by their links and not by their data! Take an array for example: its data is sequential, the indices are simple ints and you know that the next or previous element will always come in a fixed format, place and formula. A linked list on the other side can't be described only by its data, but also by a set of pointers to other entries. It's rather a descriptor of a layout and not a data storage. So, what's left if you strip down the data part is the sole list navigation made of pointers. When I was programming around with some more generic list variants than my usual doubly-linked one, I noticed how useful it would be to simply design the link layout instead of connecting it to data. Yeah, that's rather something theoretical, but it takes away some functions one would have to add otherwise. Writing linked lists is kinda annoying and isn't any real fun. It's more interesting to dump stuff on the harddrive, whatever. However, this is the point where I'm thinking about also moving this set of classes to C by providing a similar interface. Normally I would have used inheritance to add an additional data element to the item structure itself. But there's the problem you get in C then: where to inherit? Where to put the data if not statically inside the structure? Well, that's a real conceptual problem... Of course I could simply use a void pointer like glib does, but I find that rather ineffective for builtin datatypes (do you want to malloc twice? I don't).
So I started thinking about abusing memory alignment: what if the navigation structure of an entry is stored before the actual data and can be seen as an offset address before the actual data? Might it be possible to abuse the C standard's data layout definition to simply calculate the data part's offset and use a special get/set method to access different types? Well, that's not that easy. There are several things preventing this, especially the fact that C/C++ both don't give you the exact structure layout as you've depicted it. It's easier to imagine when taking a CPU's register sizes: you have several registers that can work as machine words, half words, bytes etc... You can split most of them and put stuff in them in whatever way. The compiler can therefore also store local or loaded structure variables completely in registers if your structure layout complies with the possible registers and register splits of the target machine. I was surprised when I found this out! It's a pretty effective way to avoid RAM usage for temporary variables, though it's probably not as useful when too many temporary variables get used. I don't dare complain about it not being exactly as I specified in the code cause I know that it's more worthwhile to store such data in registers. So it's not exactly possible to combine it this way. BUT - and that's defined by the C standard - no structure has any changed arrangement (padding) before its first member! So, choosing a list navigation structure as the first element of another structure should force the compiler (if it's standard-compliant) to keep two separate structures, each aligned for itself. So a getter function could be used to cast the pointer to the element to the appropriate custom data structure! You get the new layout depending on what you cast to and as what you initially allocated the structure. It's the only possible way I can imagine for C++ to keep its inheritance system working.
Parent structures need to find their data and child structures, too - even in C it's guaranteed that every struct X inside a struct Y has the same layout as a standalone struct X. So yeah... I did it! Of course, it's not really type safe. But whatever, C programmers have to sacrifice a few things sometimes... However, it makes me glad to know that a double malloc isn't necessary. I'll write down a quick prototype of it to see whether this actually works as expected... As long as it only relies on properties defined by at least C99, it should work exceptionally fine, I think.

From behind

I read a bit about the issues of using ZIP compressed files as the Open Pandora application distribution format and noticed how somebody wrote that ZIP files get read "from behind" when unzipping. Some minutes later I knew why: ZIP uses compression per file and adds a file tree information block at the end of the archive to ensure that the decompressor knows in what order it has to interpret the read files. Why from behind? Wouldn't it be easier and safer to put it in front to ensure that incomplete archives can still be read somehow? Actually, no. The secret lies in the fact that while compressing you don't know exactly how big the output will be. So you can't say for sure how long the compressed file chunks are, and a previously written file tree with block positions inside the file is impossible to create. So it simply compresses the files one after another, puts them into the archive and appends its file tree after all files are compressed. That's quite genius! Of course it's not robust against damaged files, but it has the advantage of being able to write data without knowing how big it will become, and restoring it later much more quickly. I was always annoyed by how one has to know the size of a linked list up front to make it possible to read it into an array later. That's over now! I can simply dump it all into a file and count the elements while doing so. At the end I write the size, and the reader can read it backwards to know the size before it has to generate a list and then an array from it. It's way better to immediately know the size, especially for data that doesn't change at all later. So I'm thinking about making a theoretical design for writing data dynamically and then reading it statically. It's not much of a problem, but goes well together with the idea of bringing more structure into my way of handling file formats. I used to only use text files for communication/settings exchange and binary for temporary (or permanent) dumping of raw data.
A simple three-letter header for verification and error detection while reading - that was always enough and I never had problems with it. However, it may be a good idea to think about something more... archive-alike. Something to order huge and small blocks of data, binary and textual, sequential or hierarchical... All read from behind to enable quick writing and quick reading in any case. I guess I can ignore that thing about damaged files cause I assume not to have them damaged. This format will require a two-way signature to ensure a quick format check a) when reading from the beginning and b) from the end. Both signatures need to exist and be correct before loading will occur. And storing everything in a hierarchical tree makes it possible to scan through the data, check for the saved format etc. Yeah, that's another component bringing me closer to some more file-based processing. I like creating those super-multifunctional ideas and concepts. It gives me a clue about what's a good format in many ways and how to reduce the information required to process data. Especially in terms of trees, files or linked lists, iterating through them otherwise takes endlessly long.
The bad thing about Steam is that when their servers don't work, your games don't work.


tidy backoff

I like the new Gnome version I'm on, but it's simply sooooo frustrating to find a GTK3 style that isn't boring to look at. And with boring I mean those super-smooth shiny "Human" and "Adwaita" ones. I just don't like them, they deter me from enjoying human/computer interaction (yes, I AM that geeky). So... do I have enough alternatives? Yes, I do. Do they make GTK3 apps use GTK2 themes? No, definitely not... Or at least I didn't test it properly. However, this also means that I'll need to change things a bit and simply try to get away from GTK3 as long as there's no wide-spread theme support for it. I had so many nice themes! And now they are useless. Damnit.

As far as I've noticed, the main reason why I use Gnome is its standard applications, its browser and the nice interface they give their officially supported projects. Problem is, most of them don't work if you edit all system settings the Arch way - by hand. Either it's the new Gnome version or I'm just too stupid - it'll get fixed if it comes to that. Well, I don't think it's good for me to always do this using GUI tools, you know. Or let's say I can also edit all settings by hand in text files (which is undeniably the most effective way of modifying AND learning about your system configuration). Most other things can be replaced then. I won't remove Gnome, but I'll install an alternative window manager with a custom set of applications until there are more ways to customize stuff. The cool thing about using Linux is that you can remove your complete GUI and carry on with coding in another or different window manager with no difference (provided that you know how to set it up properly).

indentation styles

While reading a Ruby tutorial (screw me, I had to do it this time) the writer mentioned the "Whitesmiths style", which was totally alien to me. Some more informative Wikipedia pages later I knew that this is an indentation style for C/C++ and similar languages. It's interesting to read about it cause it's one of the things I can't really force myself to follow. I'm more of a "streamline" programmer who styles as he codes. And depending on how one codes (I usually always code the same way) the formatting changes. I have my own way to format code, mostly to save screen space and show as much information as possible. This also means that, for small functions, I use completely horizontally aligned code. I've never had problems with code lines this way! Well, depending on your indentation style, you have more lines or fewer. I learned how well the most minimal amount of required operations goes together with condensed coding styles. However, I'm currently fortifying what I was already doing and have also started to use tabs instead of spaces. Or let's say spaces for normal command-per-command code and tabs for indentation, prototype alignment and so on. Fixed width of 4. I'm also quite a bit fascinated with LISP-style closing brackets, I have to say. It saves even more space and you can read code by indentation instead of those usually extremely dominant empty "}" bracket lines. Yeah, that sounds like a cool solution. While coding my improved linked list variant, I'll experiment a bit. Only problem is that these are templates and so small in code that they fit in one line... Too bad! However, I have some projects this semester, so I can test all this out as I want...


Back in time

So let's wash away all the philosophical bullshit and return to something more practical than the universe. Terminator! No, not the movie. When I was using Crunchbang, I had this lovely terminal emulator called Terminator which was able to run several shells in a tiling window manager-like manner. And it was awesome to use, very efficient! However, my graphics hardware was total garbage and I got quite pissed at that point. So it never came to mind again until now. I didn't think that Arch would actually have a pre-built binary for it... You know - cool software isn't always available on every distro. But to my surprise it HAS a binary of it! That's simply awesome if you ask me. So I installed it and was even more happy to see how incredibly smooth and ergonomic its commands are. Just hold Ctrl+Shift and a letter to access functionality. E and O split horizontally or vertically, W closes, P/N navigate, and using the arrow keys you can resize the panes. That's all you need! Completely eliminates all the usually necessary mouse actions. Combined with Tilda for everyday GUI usage, this is a very lethal combination. I'd LOVE a complete window manager based on this control set... Well, there probably is one. Whatever. Now I only need to find the "perfect" terminal editor to replace all the usual GUI stuff I depend on most of the time for editing. I can remember those painful lessons I got when using Crunchbang for too long on a graphically weak machine... However, I get more motivated when I work in a terminal for some reason. It's probably because it's a pure and fixed environment which doesn't change - a reliable and never-changing environment, perfect for the minimalist who prefers efficient functionality over graphical and colorful stuff? Well, I like making graphical, colorful stuff for some reason... So it's probably a compromise between having the product nice, shiny and quality but leaving the development environment graphically poor but functional.
Yeah, that's something that sounds like it would come out of my head. Once again, I'll tackle the quest of finding the perfect editor! I hope that some things have changed in the meantime, so I can either find a new one or... well. If not, I'll need to learn one of those über editors like VI or Emacs. I know I'll eventually face one of them. It's simply a fact, cause I know how well thought out they are for the time they popped up. Using one of them brings me closer to a separation I noticed is good for me: the shiny, playful world of graphics, youtube videos and video games vs. a focused and concentrated development environment that doesn't let the developer's tools glow but the resulting product (or in my case: code). I've coded far too many useless apps in too many uselessly graphical editors... Often I catch myself wondering why I can't just type commands for searching or doing some more abstract activities. Maybe it's time to take a closer look at VI or Emacs? It sounds convincing in theory, I have to say...

Edit: Oh, I totally forgot WHY I didn't use any terminal editors for so long. Mainly due to the lack of selection utilities and so on. But I haven't yet checked out the new version of nano, so maybe I'll find a nicer way of coping with this problem. I also noticed that most features I missed in terminal editors were solely those things that made it easier for me to refactor/restructure my code. I used to do that a lot - I still do, because I want to keep up a consistent style across ALL code pieces. But seriously - I should consider relying on documentation tools if I want my code to be sorted like a table. I'm glad I wrote my static n-dimensional class... It has so many lines and things one would need to rewrite otherwise... Well, leaving that aside, I think using nano instead of GUI editors from now on is an interesting way to force myself to think more about the code's effect and function/feature than about making it look completely aligned. Old code is old code and won't be changed all the same. I wrote so many little classes which I'll never really extend but rather rewrite... Maybe it's just a lacking goal. I can code in any of these simpler editors if it's about getting things done. So it's all rather a detail - I love details! Naaa, simply fuck it and use both depending on mood. That makes things a lot more... flexible.

Life, universe and everything

Holy shit, I'm through with the universe. It hit me after thinking about n-dimensional tree data nodes when I realized that everything in this world can be displayed and explained with this dimensionally layered/nested system of things I posted about a day or two ago. Taking this as a base, did you know that time traveling is only possible when finding a sub-dimension that's equal to the point of time traveled to but makes you do different things then, including yourself realizing that it SEEMS like you've traveled back or forward in time but only got extinguished in body from the dimension you belong to and then popped up in the other, predefined dimension where this exact copy of you knows what you knew in the first dimension? Furthermore, did you know that this ultimately proves that time traveling is not possible as we think, but is a construct of n parallel dimensions which are (on the atomic level) built like they fit together as a gradiated travel between two dimensions? Even better, this setup doesn't even REQUIRE different dimensions but depends on a completely predefined dimension where it seems that one traveled in time, but only (randomly) appeared from a predefined time with all the knowledge one would have if one could time travel and come back. So, time traveling happens exactly in our dimension and explains a phenomenon of series of random destructions and creations of material that either appear to be changed by the happenings of other dimensions, or are in fact only circumstances predefined by fate (where fate is seen as the complete series of all events happening in all dimensions down from the current one). Speaking of that, did you ever realize that the number of a dimension only represents a relative offset from a certain point nobody knows but uses all the time? That inventing set theory, vectors and matrices is a proof of how one can find new dimensions down and up depending on how deep one wants to dwell?
So, dimensions are endless in upper and lower numbers, making it effectively possible to explain that there are dimensions BELOW 0 and therefore that even the pointiest point has dimensions inside it we can't grasp, cause we only sense a specific (thin?) set or range of dimensions which requires us to declare new ones like we did with set theory in mathematics. That's what's currently spreading inside my mind, beside the realization that every thing has a consciousness that consists of at least one kind of memory cell (whether it's a brain or an electronic memory cell). Oh, and wars and multithreaded/multicore computer systems are essentially the same, as they have the same behaviour and activity layout. It's not only limited to war; due to the previous definition of consciousness it's also possible to interpret a brain as a self-programming program with a specific, gene-defined seed at the beginning, which is basically what one does when coding a self-altering LISP program. So you can't trick fate, as everything is predefined in an endlessly dimensionalized and sized array of mere memory cells, making up something that is not a program, thus has no consciousness without a definition of time, and represents non-changeable circumstances where even the attempt to change fate is part of the circumstances and cells defined by fate (it's the complete and gapless set of all definable states, whose number is endlessly high and endlessly nested to cover each and every possible combination inside infinity^infinity). Ultimately, this tells us that a consciousness is a special fractal-like (maybe even self-similar) pattern within a range of a dimension, due to its algorithmic, self-cell-changing behaviour.

All that in just two hours of my lifetime, and everything in the world can be explained with the same principle. We're just a small part in a predefined construct we interpret as space, time, material, events and more absurdly specialised ideas of what's behind the almighty system we are a part of. So that is the ultimate answer to life, the universe and everything. Sometimes I wonder if that's also what floats in people's minds when they smoke a joint or two while talking about philosophy, physics and computers. If so, I don't really need it. I found the (personally) best generalized answer to everything that tries to structure the universe somehow. I don't intend to combine it with existing physical models or even make something more scientific out of it - that's all just a personal interpretation to fit it all into a format understandable for a weirdo programmer's mind. Hey, that's almost a good base idea for founding a sect or so...
I'm really glad I sorted this linked list problem out. Now I have a fine, very atomic Node class with n mono-directional links to other nodes. Hooray! And it works as expected: a fixed format gives an ordered system like the one used in doubly linked lists. Lemme think... are there more formats than the usual ones? One could create VERY exotic ones, though these suffer from dynamic deletion and insertion just like all singly-linked nodes do. But with a fixed format consisting of a multiple of two for creating bi-directional links... Yeah, that's more likely. It's cool to finally be able to design the weirdest concepts ever on this base. Did I say how much I like those lowlevel bases for later stuff? You could even say I'm trying to perfect them. However, it comes at a cost (time and inspiration), but I really don't care about that if it's my personal freetime I can spend for peace-of-mind purposes. I can safely say that I'm more pleased with programming such stuff than I am with programming highlevel applications using premade code where the fun is already gone! Yeah, it's totally fun to me. Nothing beats the awesomeness of a completely and purely proven technology that's, by the way, able to create everything else, too.

Call me memory messiah, if you want.


Battery pack

My NXT rechargeable battery finally arrived. It's quite creepy how quickly one tries to finish his tests when it's running on battery. I highly recommend buying one (nature will thank you). But it comes at a pricey level... Especially cause these fuckers force you to buy a cable AND a battery to get it to work, making it effectively 82€ for a rechargeable battery. I bet this could've been less and combined... Whatever.

great idea

I got a great idea for how to combine singly linked lists (one possible connection), doubly linked lists (two possible connections) and those with fewer (uhm, is that even possible?) or more. Most really good combination designs rely on long observation of all elements to combine and a good day for the observer. So I was once again observing possible generalizations of some necessary memory-saving structures (or classes if you want, it's the same in this case) for my nxtOSEK experiments and discovered that you can of course also write every list item with n connections as something similar to the statically multi-dimensional vector I once designed. So you have N existing pointers which, depending on what kind of list you want, will work as singly-linked, doubly-linked or whatever-linked lists. You just need a fixed format, for example pointer 1 is previous and 0 is next. So all previous and next items have the same format and work completely uniformly. The best thing is that you can combine that into a very useful linked list format for all kinds of fixed-number links. So I finally found a good generalization for that. So there are those vectors for "statically sized arrays" (not really, I know), normal resizable arrays, lists with a static number of connections, lists with a flexible number of connections, and... yeah. That's it. Everything else is a combined storage type and I'm comfortable with creating these later when I've finished this one super-functional item class. Though... I should create a list of them beforehand.


Some first complaints

So after using Arch for a while, there are of course a few complaints I have. The harddrive spins down quickly (can be set differently I guess), some programs simply have to be installed manually etc. Step by step I'm developing more into a home directory-centered user. I have my own environment variables for custom binaries, my own bin folder etc. It's not a completely bad thing. Of course, nothing beats the comfort of installing apps with a single command and having them user-independent. But as the sole user of this system (that won't change), I can also download things to my home folder in case of missing availability. However, it's not a nice solution and I prefer not doing this. It's ok for testing, but it's better that I either wait for the programs to appear or package them myself.
A problem not exactly related to Arch is that Epiphany 3 under Gnome 3 doesn't work with Flash. You can install whatever plugin for whatever browser - it doesn't work and you can't do anything about it. I tried to install the Flash player manually, but I had already installed it using my package manager, so it doesn't make any sense. I mostly use Flash to watch videos on Youtube and other sites. I wondered whether HTML5 would do the magic for me if I asked nicely, but well - it seems quite impossible to do anything with this. So after looking through some threads in the Arch forums, I discovered an app called "minitube" which is quite the coolest thing ever for watching Youtube videos. It gives you a complete and clutterless interface for viewing and downloading videos from Youtube, including playlists, related videos, resolution settings and so on. Simply THE application you need if you like watching Youtube videos but can't find working support for your browser. So I can now comfortably watch and download! Awesome, just awesome. Too bad this doesn't work with embedded Flash videos. I always have to inspect the site element, copy its Youtube link and insert the found channel or video names into minitube. It's more work this way, but at least I can watch without all those stupid browser distractions. Totem also has a Youtube plugin, but that crashes too often and is generally inferior to minitube. So yes, I currently don't need Flash. Too bad no other embedded Flash player will work either. So I have to look around for another program for listening to last.fm...

By the way, I'm currently using the drop-down terminal "Tilda", a console you can invoke and hide by pressing F1. Good stuff and customizable. Way better for compiling and debugging than a built-in terminal cause you can access and hide it system-wide. Takes only a portion of the screen, very nice to work with. So far I feel better using this setup than I felt using Linux Mint. It's not as "helpful" as other distributions, but highly customizable. You don't have so many thick extra layers of strange and useless daemons in the background. You can react to your problems directly and tackle them at the root. Quite useful, I have to say, indeed.

Oh, and I found a great replacement for gedit: medit! Yeah, the name doesn't show its differences at all... It's like Paint vs. GraphicsGale. Way cooler and more functional than gedit itself. Yay, I can finally move away from it. A good environment needs constant improvement.


It's all the same set

All of a sudden I started to program some random, memory-iterating things for "fun" (yes, I'm that nerdy) and remembered a TV show about dimensionality, the theory of everything, space curvature etc. It was really interesting and turned into a "what happens when we turn back time to pre-big bang?" sort of thinkage (which was even more interesting). Physicists are interesting persons. They try to put reality into models, want to prove them and go back if it didn't work. They follow a simple philosophy and don't shy away from repeating experiments over and over again. With that in mind I wondered whether it's possible to apply the idea of dimensions and sets as subsets of other sets to programming and code generation. It begins with simple things: a number of states and a thing that is one of these states, a variable. So we have a set and one of its elements which we don't know. So there's set and element. Interestingly, a multidimensional vector is a multiple of elements from a single set, whereas a multidimensional array is a set of vectors from the set of all vectors. You probably know that stuff from mathematics: matrices and other related things. But that's all just based on states and fixed-size sets. What about algorithms? Commands? Executions? What if we can also classify sets of algorithms as vectors and matrices and matrices made of matrices? Let's take a string as an example: if you want to find the exact end of a string, you usually iterate til \0 and know the address or offset. That's dimension 1 - n characters terminated by \0. Let's abbreviate this with a regex-like notation, "?*0". So if that's dim 1, then dim 0 must be "0". We have dim 0 and dim 1 - the rest is induction with a different ending character. So if dim 0 is "0" and dim 1 "?*0", then dim 2 could be "?*0?*0", or even come with a completely different set of ending characters: "?*g?*4" and so on.
I already tried to get a system done using C++ that works this way, building dimension-oriented setups of structure verifiers, array iterators etc. But I would need to specify all single cases at once, which is only limited by the upper levels of infinity due to random combinations of algorithms etc. If one wants to write all combinations of two-dimensional vectors in non-variable notation, he'll never end. So if I wanted to create fixed algorithms for all possible situations, I'd end up with a never-ending task made of algorithms I'd probably never use! So what we need is a system that eases the creation of code during compile-time, with which we can abstract a special n-dimensional set of actions. An algorithm is just a set of values, like code. I know, that may sound super-silly, but it's actually a quite useful thing. Every time someone writes a non-generic program, its functionality is fixed and can only be seen as a single instance of a somehow defined, complete set of instructions. If you give your computer language the ability to describe not only those instances but their rules of creation... you'll effectively have the seed of all possible algorithms resulting from your generalized description. You can find limited implementations of this in C++, Java, etc... Templates, Generics - they all give you the possibility to do all this, but it's unintuitive and not efficient enough to completely solve problems based on seeds. This is an important realization for me, as it draws a line between mathematics, physics and programs. I see a strong relation between them on this level - everything is a description or an algorithm or both. Those connections are everywhere, not only inside a fool's head. Instance and definition, abstract and concrete. Two opposites relying on each other, being impossible to exist if the other is not present. These two elements can be found in anything spiritual and scientific.
Genetics is defined by genes and the resulting animals, religion praises its god's plans beforehand and spreads the stories about what happened after, theoretical physicists analyse the present and try to find the rules it is made of. Even we, as humans, depend on how we were made and become what we are now. Begin and end. Alpha and Omega. It's noticed by so many people in the world - and yet most programmers don't even TRY to think about incorporating it. That's the problem with this world. Everything is so focused on the here and now... nobody's looking back far enough to tackle the root of everything except some brave, revolutionary physicists and deeply spiritual people. I (almost) feel like I need to play my part this time. The world of programming needs to ascend and realize its place in our reality and not only in business problems. I'll dedicate my life to advancing the perception of those connections in a programmer's world. No matter the cost, time or nerves spent. This is the shit I live for.
So far everything went ok. The new Gnome is different from most flavoured Ubuntu variants I tried, but it gets the job done. There are some annoying things like no setting for keeping brightness at max when running on battery etc. I don't like to adjust it every 5 minutes I don't type... However, it's VERY simple in its overall functionality and appearance. I rather prefer using the fallback mode cause it doesn't (or at least it shouldn't) utilize those shiny and battery-sucking graphics effects. Oh, and I'm pleased to read that Mutter (its new window manager) is compatible with Metacity themes. So I can install all the decorations I want without touching its admittedly ultra-thin window theme.

I only need to import browser settings, install missing apps (there aren't sooo many) and whatever comes to my mind. It's unbelievable how much crap is in Ubuntu distributions. Nobody needs this...

Ahoi from Arch

Heyho fellows out there! Not that anyone reads this, but I've successfully set up Gnome 3.0 on Arch Linux and it was easier than I thought. Arch has a great wiki I've already utilized before. It's damn easy to set up and customize if you're already a bit into Linux via Ubuntu or similar. Combined with pacman and Arch's build system you can install and build whatever you want without having to worry about outdated software. It's simply awesome.

And Gnome 3.0 looks way better than I remembered. Ok, I have to admit that I used a test version back then and tried to combine it with an existing installation. The new one looks amazingly clean and not as static as the typical Gnome 2 desktop (seriously, all this clutter got annoying). So I'm looking forward to only installing the stuff I need for what I do - no more stupid service thingies, no distribution-based "extras" and no other shit obscuring my system! That's what's great about Arch: you can do anything you want with it cause it's rather minimal. A few package programs, some bootloader configs... Nothing else. You get the bare bones with an all-purpose kernel and no problem.

Oh, how I love this new toy!

Arch Linux

I dropped the idea of tinkering with those tiny RAM-only distributions (they generally only work well if booted from CD or USB) and spontaneously installed Arch Linux. It's a minimalistic system with nothing but the essentials of what makes a Linux/Unix system. It's quite fun to tinker around this way, but soon I missed the comfort I was used to from GUI systems. However, it's simple to navigate and configure. I even had fun using it without any interface I was used to. After I installed a few basic things, including a pacman database upgrade/update and Arch's build system, I immediately thought about installing Gnome to serve my habits. But I don't think that this would improve my skills with using Linux or help me to change the way I motivate myself. I know it's necessary to install graphical tools for simple interaction and browsing - but I felt so free and clean when writing some test programs using nano... There is nothing but code and you can concentrate on what you actually wanted to do. A purely textual environment is way cleaner and doesn't make me think that computer systems only consist of colorful buttons and animated popup windows. It gives me back the simplicity I always missed before I started using more direct ways of development - text editors and commandline compilers. I even considered taking a look at vi for more direct editing than pressing a thousand key combinations to move a line from a to b. Focus is everything to bring me back from rather depressive phases. So whatever comes in my later life, a window manager and associated programs should not be necessary for me when programming. This brings me and my workflow closer to efficiency and simplicity. All those other developers or lecturers can tell me what they want: development is a spiritual thing. Focus and unity with path, goal and peace of mind.



While I thought about how nice but lonely this one little nxtOSEK guide looks on the left, a thought ran through my mind: do I have coding guidelines or development philosophies? Well, yes - implicitly. I never wrote them down and tend to adapt them to the style of the team I'm working with. Although I prefer "perfect code" for private projects, I never thought about what kind of principle lives behind that. So, as always in case of doubt, I wrote a few lines about things I like and things I actually do. It turns out that I praised exactly the things which hold me back from getting productive. I have this massive melon on my shoulders, but nothing in there tells me that developing the stuff you're working on is easy to combine with what you'd call an "ultimate" philosophy. In fact, I praise purity, efficiency, simplicity and generalisation. Seen as single components, they give solid guidelines to follow. Purity is also consistency, efficiency is never wrong, simplicity makes your life easier and generalisation brings its custom flexibility. But all together? Quite a dream in reality. These four things are what makes me design new stuff on and on. I question taught techniques, try to improve or perfect them - I do all this to make my peace of mind happen, and yet I can't follow these guidelines without sacrificing productivity. All of a sudden I wondered what exactly the famous Unix philosophy was. Shortened to "do one thing and do it well", it becomes something I already do when working in teams. I have a task and try to achieve it as well as possible. When it's done or the time is right, I help out others in the same way. Interestingly, I've never given out advice that doesn't live up to the essentials of those very few and stretchable lines (except in this blog, which is more of a valve and freetime thingie). No wonder I prefer seeing the Unix philosophy applied. I believe that I can keep my "ultimate" philosophy goal by following the Unix principles (interpreted, of course).
The magic lies in "do it well", which is what I shall look out for. Doing something well can range from using a primitive bruteforce approach to implementing an event-based GUI layer system. It can depend on so many things that I'll ultimately be able to see advantages where my original "guideline" didn't work. It's not always necessary or possible to do something as efficiently as doable. So, what is doing it well, then? Depends on whether it works well with everything else, I'd say.

These moments are what this blog is made for: dumping shit no normal person would listen to.

Run from RAM

A good distraction method I found is looking for new Linux distributions. It's always interesting to browse certain lists from time to time and see what kewl things one might find there. There's an interesting list of distributions that run from RAM, loading their data once from a drive and starting quicker than normal (at least in theory). Could be interesting to test some of them. Since I only use a few programs, it might be very useful to save space and back up faster (one thing I do too seldom).

A bit of orientation

I'm currently not in a very good mood, mostly due to information overload. I shouldn't have read through all the OSEK documents. It makes your brain melt for a few days and then you're wondering what happened. I think I should try to reorient myself a bit, play some videogames and clean up the informational mess in my head. I tried to drop some ramblings and catch a free mind, but it didn't quite work. Instead, I started working on another gun cartridge in 4x4 format. And well, I did it, but it doesn't get farther than the 3x3 cartridge. In fact it has a less flat curve but higher accuracy. A dozen rubbers were tested and didn't serve as well as they should. I could theoretically increase the pressure, though this could damage Lego parts or simply make them perform badly and bend. *sigh* I don't know what's wrong today, but nothing brings back my motivation. Maybe some sugar will help getting my brain back on the horse.

Also, Blogger has fucked up some categories. Screw you, it would've been better to disable all blogging functionality until everything was back to normal.

It's back again

Yeah, and I was stupid enough to immediately rewrite the guide. Oh, why does this always happen on Fridays?


Shortened guide for setting up nxtOSEK on Linux (and a bit Windows)

The original version of this document was written to help fellow Linux programmers set up nxtOSEK under Linux. Since the Windows installation is already very well covered, I thought it might be a good idea to write some to-go installation instructions and explain how to set up a basic workflow that works for both Linux and Windows.

1. nxtOSEK installation on Linux

1.1. Requirements, installation environment

First of all, you'll need Wine, GCC and most probably internet access. On my system (Linux Mint 9, based on Ubuntu) I've installed Wine 1.2.2 and GCC 4.4.3 as well as APT for installing required packages, so I can assure you that this combination works quite well. Apart from the compiler, you don't need to build any of the tools yourself; binaries are already available.

1.2 Download and build ARM GCC

First of all, we need to build a version of GCC that compiles for our ARM target platform. Install these packages using your favorite package manager:

tk-dev ncurses-dev libmpfr-dev texinfo

and download + execute this script; it'll build one for you. Better bring a game or a book, this can take its time. Place the result in a directory you can remember later ("~/bin/gnuarm" in my case) because you'll need its path in nxtOSEK's settings.

1.3 Download and setup nxtOSEK

Here comes the step most people forget. Download not only nxtOSEK itself from the download page but also the OIL parser/generator "sg.exe". Currently, there's a yellowish box below the download links explaining where to put "sg.exe" in the main folder. If you don't put it there, Wine won't be able to do anything except dump useless error messages. I'll describe the OIL thing in section 2, so stick with this for now.

In "/nxtOSEK/ecrobot/tool_gcc.mak", set "GNUARM_ROOT" to the folder you put your freshly built ARM GCC into; this tells the build system where to look for the binaries.

1.4 Replace original NXT firmware

It's described quite well on the official page, so simply follow 3b there. If you already used NXC/NBC with extended firmware or leJOS, you've probably already found a way to flash on your own.

1.5 Compiling and uploading

In "nxtOSEK/samples_c", just pick one and execute "make all" there. It uses a special makefile you can reuse later with some minor modifications. If everything went right, there should now be an *.rxe file (along with others) you can upload to your brick using NeXTTool (step 4b). And again, NXC/NBC users will probably know how this works. Otherwise, follow the page instructions or study its help function. Don't waste your time with those makefile-generated shell scripts; they only work on Windows using Cygwin (at least on my system I couldn't get it to generate Wine commands).

2. Everyday compiling, setting up a proper workflow

(The following instructions should be similar on Windows, provided you've correctly installed Cygwin and know how to run and edit makefiles/know a bit about bash, your home directory etc...)

If you don't use Eclipse/Netbeans or whatever plugin stuff exists for nxtOSEK, you'll need to compile directly using make and an *.oil file. You can simply copy one of the makefiles under "nxtOSEK/samples_c/" for C programs or "nxtOSEK/sample_c++/cpp/" for C++ programs. Let's take a look at helloworld's Makefile:

# Target specific macros
TARGET = helloworld_OSEK
TOPPERS_OSEK_OIL_SOURCE = ./helloworld.oil

# Don't modify below part
O_PATH ?= build
include ../../ecrobot/ecrobot.mak

TARGET specifies the name of our app, TARGET_SOURCES is the list of all our source files and TOPPERS_OSEK_OIL_SOURCE specifies our *.oil file. For everyday programs you can keep "helloworld.oil" as your base and extend it later if you want. It's basically a specification of how the OSEK kernel should be compiled. Every time you change your *.oil file, a new kernel will be generated which automagically includes your program. So in the end you get the whole package you need to start and manage threads/tasks/etc. in one program. However, I won't explain all the details here. There's a so-called "Quickstart" for nxtOSEK which pretty much summarizes everything you'd want or need to know about OSEK and its OIL stuff. So before you go wild, you'd better take a deeper look at either the sample projects or the reference documents.
What's left in our makefile is the last line, which includes ecrobot.mak. It's the part which actually compiles your sources, and it requires a relative or absolute path. Unfortunately (at least I do this sort of thing), most of your source files will probably NOT lie inside the nxtOSEK folder itself, so it's advisable to move the nxtOSEK directory to "~/bin" or wherever you usually put your binary files. When creating your own makefile, you then only need to put this fixed path into the last line. If you also move NeXTTool to "~/bin" and append it to your PATH variable in .bashrc, you should be able to build, upload and run projects from anywhere with no problem. Windows users can simply edit project files using an external editor and then run make inside Cygwin. Since Cygwin mimics a simple Linux environment, all commands should be the same (I haven't tested this yet). Alternatively, run the *.sh files make generates for you in your project folder. They are called "rxeflash.sh", "ramboot.sh" etc. I can't test these at the moment, so be aware of what exactly they do. Not that you accidentally flash away your BIOS.
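So a minimal makefile for a project living outside the nxtOSEK tree could look roughly like this (the project name "myapp" and the "~/bin" location are just my examples; adjust them to your setup):

```make
# Hypothetical project makefile, assuming nxtOSEK was moved to ~/bin/nxtOSEK
TARGET = myapp_OSEK
TARGET_SOURCES = myapp.c
TOPPERS_OSEK_OIL_SOURCE = ./myapp.oil

# Don't modify below part
O_PATH ?= build
include $(HOME)/bin/nxtOSEK/ecrobot/ecrobot.mak
```

The only real change compared to the sample makefiles is the fixed include path in the last line.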

For specific questions you can take a look at the FAQ.


Great. I wrote a huge rant about Java, decided to correctly set up nxtOSEK for native C/C++ development, got it to work perfectly and then added my first tutorial ever about setting it up correctly. And ya know what? Those fuckers messed up their blogging software and now all posts since I discovered it are lost. Fabulous. You guys really know how to make people happy. Now I have to write it all up again and care about proper formatting. Hmpf.

To summarize the stuff that happened: I got a huge mindfuck from nxtOSEK, studied its source down to its bones, learned more than I learned in the last three months and am now able to utilize my complete codebase and put all development focus on creating embedded applications for upcoming Lego projects.

Oh man, I was so happy that I wrote the seemingly only existing tutorial on how to set up nxtOSEK on Linux, and now that... Damnit. At least I now know that saving all posts on your HD is a backup method to consider when "publishing" stuff on the web. It's always like that: once you did something good and deleted it from your HD because it's up, something bad will happen. Fuuuuuuuuuuuuuuu you, Blogger. I'm gonna make my own page for this. And then you'll whine about the traffic you lost with that shit action.

Anyway, I don't feel bad in a programmer sense. I can focus all development on NXT code because software-rendering the display and generating sound makes more sense there than on normal PC platforms. I think I found the niche I was looking for for so long - there's literally NOTHING premade on nxtOSEK except some driver functions from leJOS and ecrobot and the OSEK OS' minimal API. In the next tutorial version I'll write a bit more about it, after having read its specifications, the ecrobot API and so on. I want to see a complete guide on my future page. There seem to be many people interested in using nxtOSEK (understandable, no other system delivers its power), so I will share all my experiences and try to popularize it a bit.


Code Slayer

I found a minimal and lightweight alternative to existing IDEs and plain code editors. It's called CodeSlayer and follows a simple "choose folder as project" philosophy without creating any associated extra files or rearrangements. It's like a simple text editor such as gedit combined with folder-based, custom project management. I like the approach and that it completely focuses on project management. You can also group projects and have them separately visible. Nice idea, I say. This way one never loses the synchronization between project and filesystem structure.

Yeah, there's not much to say about it except that its minimalism is optimal, I think. The only thing bothering me is the missing built-in terminal, but that could be solved by writing a plugin. It's funny how it only gives you a basic outline and seemingly relies on plugins made by others. Not sure whether there are many guys out there using it, but maybe one day I'll decide to learn GTK for some reason and write plugins for it. Though... I never really was a fan of frameworks. So probably not. I'd rather learn wxWidgets instead; way nicer and less bloated.

suck suck

Oh boy, I found this one while browsing through Coding Horror, and the man pretty much speaks the truth (at least from my point of view). Remember when I wrote my Java iterator thingie? Well, I ended up having a big bunch of interfaces and things to implement, but it wasn't even a tad productive and I rewrote it two times or so, after realising that it would've been faster to not rely on it and simply do it the fixed way. But well, it's Java, and if I'm gonna do it in Java, I'm gonna do it in a way I can call reusable for all of my later leJOS projects. And looking at the mail I linked, I can see some similarities in my own code. I can't say that all OOP-resulting code is "bullshit" (especially because I know that good OOP comes close to good C code if you don't overuse or completely pervert it like it's usually done nowadays), but I often experience that most OOP fans simply aren't able to push the right buttons on their abstracted machine. And they tend to hide stuff (one reason why I conceptually rather dislike it) instead of revealing what they do. Teaching beginner programmers OOP first is a similarly precious topic. The most horrible example ever of such an "experiment" gone wrong is my lecturer for component-based development. She doesn't know anything behind her JavaVM (oh, I love ranting about her) and the only thing she does is repeatedly iterate through her concepts and models and whatnot without seeing that this stuff doesn't work as it should in theory. Her interpretation of "re-usability" only involves abstract models, meaning you always need to write again the code you wrote before. Is that reusability? No, it isn't. It's talking about stuff that doesn't matter. If you instead write your function set without those models but in portable C, it becomes completely modular and more reusable than those model things.

However, all this doesn't help me push my personal project further. I decided to rewrite as much of the "bad" Java stuff as I can on my own. Yeah, maybe I can keep this as a playground for OOP, which (retrospectively) kinda stopped me from continuing my projects, if I think about it. I can remember that during my "hot phase" of development, I only used methods in structures to avoid typing this-pointers all the time as parameters. I have to say I slowly started to think in more OOP-like terms and models, and (along with other reasons) I kind of stopped being productive in any way. Just look at what I did: writing classes and object-oriented systems with no actual use or relation to the stuff I was working on before. Then it took me a long way of deeper research to see that it's essentially not OOP what I want and that the OOP world totally sucks down to its bean-shredded bones. I was so motivated before all that. I simply wanted to get stuff done. Now I don't have any projects besides rewriting stuff in languages I don't really need, or what. It's time to take a break from all this "all object" stuff and get back to the essentials, the fucking ROOTS of code. *

* Though I have to say that I still don't want to code with anything else on the NXT. The VM works ok, I don't need to use any code that's not mine, and I couldn't get the GCC ARM toolchain to work before to develop native C applications. And leJOS sound is simply great, I have to say. So once again: shut up and use it where you need it, go C otherwise. Hm, if I get these simple Java class extensions done, I will concentrate on being productive and pushing on. One way or another, I don't want to sit all day and make silly models. In the end it's all the same and will prolly work better with normal function calls than overly complicated and time-consuming abstraction forces.

I should print a motivational shirt or so. Some bad Chuck Norris joke paired with programmer fanboyism.


Linux kernel

Today, I and my fellow students had to program a simple Qt example project using its QGraphicsView class (and all the stuff connected to it), which I somehow couldn't get to work because of Qt's silly OOP perversion and me not being in the know of how to solve it. However, after some words back and forth, my lecturer asked what I prefer to program, what my domain is if not perverted C++ OOP. Well, my honest answer was that I'd rather code in C, which is actually quite true. C is, within its bounds, a language one can definitely cope with, in almost any case. I always was a bit lowlevelish, so I gave this quite honest answer. Interestingly, he kind of offered to pull some strings and arrange me a place as a Werkstudent (a student working in the same area as they study, in parallel), working on Linux kernels in embedded devices and thus precious lowlevel tech made in C and assembler. He was surprised to still meet a student who prefers such non-OOP languages, and so was I when he mentioned knowing some people who'd be interested in getting students into their shop. Well, at first I mentioned that I prefer being a full-time student, not working while I study. But he really pulled out further alternatives (working during term breaks etc.) and said I could write him a mail if I'm interested. If that's not an offer! I started to think about it: do I really want to do this? Linux kernels? "Hm" was the only thing I had in mind. I only knew the stuff from rather non-covering light lectures about VERY basic and abstract schemes for how an operating system can or does handle multithreading, paging etc. (the lecturer was more of an admin, not a developer), and didn't like it so much. I always said to myself that I'm not a system programmer and don't wanna deal with anything non-portable (between systems). But having around 4 hours free between lectures, I began to think about it:
- It's an offer that's free to take and you can only profit from it.
- Your studies are less fulfilling than you thought and you even began to do your own stuff (definitely a sign of being under-worked)
- Do you really know something about the stuff that's going on in this area? Can you judge it by the few things you know about it?
- Ever considered learning something there that's useful to you? For example, something that could help you code your programming language with more insight, thus improving the world of software development, or even writing your own software-made computer like you once planned with your own language instruction set?
- Is there any lowlevel stuff out there except compilers that you'd do instead? Do you actually KNOW any other areas?
- What if you like it and decide it's worth experiencing and seeing your own software in a marketed product?

So all in all, I only saw points that made sense. It could be interesting, informative and a reference for my later life. I never worked as a programmer; I also rarely finished something that can be called software as software is nowadays. I tinker around with never-to-be-seen functionalities, stuff that's never going to be noticed by the user. So what about a wedding then, between me and the more lowlevel areas of computing? I read some pages about it on Wikipedia and it seems to be quite interesting to learn about and do yourself. I'll read more about it, get into the material a bit more and then write him a mail asking how exactly he imagined it. So if everything turns out as expected, I can either work somewhere during the semester and/or during the term breaks. I need to ask him a bit more about that, or the dudes he'll direct me to.

Cross some fingers for me; that could be something really worthwhile and improving for me. It also prevents me from being too annoyed about the fact that I never get out of my small world. It really is a small world where I currently am. I even enjoy reading Wikipedia articles to spice up my general boredom. It could also go in a totally different direction. Maybe I won't have time anymore to realize my own projects. Maybe I won't need my personal projects anymore. Who knows. Whatever, it could also be that they don't want me because I have no references, because I never really worked with system functions, or for any other reason I can't come up with. Who knows. Let's get it coming.


Much better Iterators

I've fleshed out the details of a better iterator system and even had a moment where it proved its flexibility, technically. Normally, one would either use iterators with a fixed and limited function base, or pointers aka direct addresses to the items, then calculating or reading the next address. Both concepts are pretty fixed, and iterators have a more runtime-generic approach that can, with careful design decisions, also be done statically. Used statically and properly inlined, they provide equal performance - one thing to keep in mind, I think. However, iterators are a tad too generic, as they only have very, very basic functionality. For example, how do iterators link/unlink elements in a doubly linked list? No way you can get that properly done with a fixed set of commands that also has to work with arrays or non-linked lists. So I took the concept a little further by defining an army of interfaces. These interfaces contain only one or two methods each and mark significant features of the underlying "storage class" (meaning the way of storing elements) and of per-element navigation/operation. I read about so many different possible ways of storing data elements; only a small subset of them works with standard method sets like push/pop/next etc. So if your algorithm needs a storage class or iterator, you only need to make it generic, use the type T and extend it with all the interfaces/methods you want to have. It comes at a cost, though. The most annoying disadvantage is that you need to specify each used method on your own (which can be simplified by creating dummy interfaces including all other required interfaces). Since it's only based on interfaces and generics, I think it's quite static except when I use polymorphic generics. However, I believe in this concept and see great use for it in everything upcoming I may write later. Though it's currently only done (haha, done...) in Java, it's probably quite easy to port it to C++. Guess I hit the nail this time! 
And implementing it in Java first makes me a bit more comfortable with it. I must admit I'm quite conservative when it comes to choices of programming languages. I prefer having both lowlevel and highlevel functionality - only if they are well-designed, though. In this case I prefer lowlevel functionality and mixing my own mojo.
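To make the idea a bit more concrete, here's a stripped-down sketch of what I mean by an "army of interfaces". All names here are made up for illustration, not the actual code:

```java
// Tiny capability interfaces; an algorithm only demands what it uses.
interface Nextable<T> { boolean hasNext(); T next(); }
interface Prevable<T> { boolean hasPrev(); T prev(); }

// A generic algorithm bound to exactly one capability:
class Algos {
    static <I extends Nextable<Integer>> int sum(I cursor) {
        int s = 0;
        while (cursor.hasNext()) s += cursor.next();
        return s;
    }
}

// Any storage class opts in by implementing the matching interfaces.
class ArrayCursor implements Nextable<Integer> {
    private final int[] data;
    private int i = 0;
    ArrayCursor(int[] data) { this.data = data; }
    public boolean hasNext() { return i < data.length; }
    public Integer next() { return data[i++]; }
}
```

A linked-list cursor would additionally implement something like a Linkable interface for link/unlink, which an array-backed cursor simply never claims to have.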

I'm looking forward to seeing this concept in action and designing my renderer with it. Something tells me that, on my own, I'm always "inventing" everything again, no matter the complexity. Well, someone has to do it. What would the world be without people like me? A bunch of people without any idea of actually trying to improve existing concepts.


Better Iterators

I got a fabulous idea for how to get something better out of iterators. In Java they are rather simple and primitive; in C++ they give you much more but don't take it too far. I think it's time to overhaul my "storage classes" and give them a new, more generic touch. I fell in love with Java's interfaces, I have to say. If you forget about the garbage collecting, its not-so-well-designed standard libraries and memory-abusing nature in general... it becomes a simple system with few limitations to keep in mind. Sometimes you only need a certain degree of stripping the standard library to see what a language can really be. I think the problem I usually have with programming languages is that their standard libraries are rather bad. This can be due to historical reasons or just because the developers aren't very good API designers, whatever! That's not always the case. PureBasic, for example, has an excellent standard library while giving you all possibilities to do everything on your own. C also has an acceptable standard library. Not overwhelming, but for a reason: it's old. Many other frameworks and libraries also deliver the same functionality one would expect from a standard library. Qt, for example. It's possible to develop almost anything without the C or C++ libraries if one uses only Qt (at least that's what stuck in my mind). So, it's all a historical question. I shouldn't be too hard on those things in the future... Language designers aren't always good library designers. And those designs always depend on what the library user is used to or what he prefers.

Hm, this makes me more comfortable with the world of programming, I think. Plus I now have a plan to get used to iterator systems. Hm...


Well, since my rendering concept involves a bunch of lists, I had to get them into my program somehow. If it were C++, I could simply use my existing codebase. Not so with leJOS - there doesn't seem to be any linked list that's truly one. They use arrays all the time but no linkage. So I asked myself: try to port your own class or invent a new one? Well, it's easy to port my favorite one (I already did, a matter of seconds), but thinking about the memory limitations and that I only have a maximum of 32kb available within leJOS, it could be worth taking a look at pre-allocated data structures for storing data. The task I give leJOS is quite memory-demanding: you need to store 4 ints for a rectangle, 2 for a point, plus additional references and temporary rectangle data. I can see where this could cause trouble with many display elements. So that's a situation where I finally encountered a problem that's smarter to solve with iterators! Iterators aren't specialised; they use an underlying interface and redirect to a certain function. I could've designed it with my linked list, wasting memory as I go and forcing the garbage collector to take its time cleaning up once in a while. This wouldn't happen in a C program, but I don't have such things at hand. So to avoid later trouble with what memory structures to use, I'm considering writing my own set of iterator rules and underlying classes. It always ends like this, eh? Anyway, if I can't avoid inheritance or interfaces in this case, I should use them wisely. Inheritance is one of those things I simply dislike. Interfaces, on the other hand, are extremely handy for writing more generic stuff. Whatever, this is a bit annoying. Not just a bit, it's stupid. Pah...
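One way to sketch the pre-allocation idea in plain Java (names and layout invented by me, not a leJOS API): keep all rectangles in one flat int array and hand out integer handles instead of objects, so nothing is allocated after startup and the garbage collector stays quiet.

```java
// Sketch of a pre-allocated rectangle pool for a tiny heap.
// One flat int[] holds x, y, w, h per slot; a slot index acts as
// a handle, replacing an object reference.
class RectPool {
    private final int[] data;   // 4 ints per rectangle
    private int count = 0;
    RectPool(int capacity) { data = new int[capacity * 4]; }
    int alloc(int x, int y, int w, int h) {
        int id = count++;       // no per-element allocation, no GC work
        int o = id * 4;
        data[o] = x; data[o + 1] = y; data[o + 2] = w; data[o + 3] = h;
        return id;
    }
    int x(int id) { return data[id * 4]; }
    int y(int id) { return data[id * 4 + 1]; }
    int w(int id) { return data[id * 4 + 2]; }
    int h(int id) { return data[id * 4 + 3]; }
}
```

On a 32kb heap that trades a bit of readability for completely predictable memory use.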

No direct access

Damnit, I should've known that. The NXT's native image format is, like the display itself, bit-based and not as easy to access. leJOS' source code revealed some odd details about the format (it seems to be packed in y-aligned pixel packs instead of the nowadays common x-aligned packs) which are not really friendly to my head. This means that I can't access the memory as quickly as I hoped to. Many operations are needed to do this, and I bet that C code would've performed better than the current Java version. So either I stick with normal pixel putting and follow the path of minimal pixel drawing, or I devote another day to getting a not-really-faster version done. All in all, I'd still need to iterate in Java either way. No way that cutting the clip commands would help speed it up significantly. *sigh* So it isn't a viable idea. Ok, ok. Back to monochromatic displays, slow drawing and dithering. I've seen a demo using bitblit functions and prepared XBM images and geez, that was fast. As fast as drawing natively, while I have to tinker with stupid per-pixel plotting. Someone really needs to implement a better setPixel routine...
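If the format really packs 8 vertical pixels into one byte (which is how I read the y-aligned layout; take the exact details with a grain of salt, this is my reconstruction and not copied from the leJOS source), then every setPixel needs a divide, a modulo and a bit twiddle, which is exactly why per-pixel plotting is slow:

```java
// Sketch of addressing a y-aligned packed display buffer.
// Assumes 100x64 pixels (NXT-like) with 8 vertically stacked
// pixels per byte; buffer layout is my assumption.
class PackedDisplay {
    static final int WIDTH = 100, HEIGHT = 64;
    final byte[] buf = new byte[WIDTH * HEIGHT / 8];
    void setPixel(int x, int y, boolean on) {
        int idx = (y / 8) * WIDTH + x;   // which byte holds the pixel
        int bit = y % 8;                 // which bit inside that byte
        if (on) buf[idx] |= (byte) (1 << bit);
        else    buf[idx] &= (byte) ~(1 << bit);
    }
    boolean getPixel(int x, int y) {
        return (buf[(y / 8) * WIDTH + x] >> (y % 8) & 1) != 0;
    }
}
```

Compare that to an x-aligned row layout, where writing 8 horizontally adjacent pixels is a single byte store; with vertical packs, a horizontal line touches one byte per pixel.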

However, grey shall be ignored and dithering is the new buzz.


I couldn't wait to test some of my ideas for how to simulate states between black and white on the NXT's display, so I did a quick test run using the same on/off principle I already used for imitating polyphonic sound output. Thanks to the display's slow reaction time, it's possible to simulate different, slightly moving/flickering shades of grey by briefly displaying them every n milliseconds. I think I could get really smooth shades with some more experiments, but I don't think it's a good idea to invest in it. Those alternating black/white pixels require constant redrawing - even with bitblit functions alone you can't do this quickly enough for big screen areas. It's only suitable for small sprites due to the display's slow reaction speed. But I could probably speed up the rendering by not accessing the farther-away graphics memory using setPixel, but rather using some of leJOS' predefined buffer copy functions. Hm, that could solve the problem and would speed up the drawing enormously. Hey, that's also interesting for the renderer. If the performance can't be improved, I'll rely on dithering.
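The on/off principle boils down to a duty cycle. Here's a sketch of the frame logic, with the grey scale (0..4) and the 4-frame period being arbitrary choices of mine, not what actually ran on the brick:

```java
// Sketch of temporal dithering: a pixel with grey level g (0..4)
// is lit in g out of every 4 frames. The display's slow reaction
// time smears the flicker into an apparent shade of grey.
class TemporalDither {
    // Decides whether a pixel of the given grey level is lit this frame.
    static boolean lit(int greyLevel, int frame) {
        return (frame % 4) < greyLevel;   // duty cycle over a 4-frame period
    }
}
```

A pixel at level 2 is lit in exactly half of all frames; on a fast panel you'd see flicker, on the sluggish LCD it reads as medium grey.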


I think I'm done with all the theory. It's not as easy as simple bruteforce z-buffer drawing, of course. It was kinda tricky to figure out, but again it proved that the first actually interesting thought in my mind was the best one. I cannot say much about its complexity, but provided that my "dirty rectangle"-like set operations perform in acceptable time (which I believe, because they require just a few instructions), it could get a nice O(n) performance in the best case (no non-opaque elements) opposed to O(n^2) in the worst case with only non-opaque objects. But I guess that special case is worth its own implementation without opacity-centered drawing reductions.

However, I'm pleased with the current concept. If everything turns out well, I can profit from a simple and relatively easy-to-implement renderer. I already thought about possible uses, interface concepts, etc... hopefully possible with this new technique. Provided that it works as quickly as expected - which can't be guaranteed if you don't test or prototype it.

Hooray! I'm gonna draw some mockups now.

Writing a surface occlusion detector

As I expressed in my previous post, I want to write an algorithm/system that renders only the parts that need to be rendered in a Flash-like display list with z order. My idea of only updating the changed areas is solved quickly by iterating through all upper and nested objects and listing them along with their yet-to-update area. Simple. But that doesn't solve any other problem. Whatever you remove from or add to the display list has its own z value. So if you delete a picture from the lowest layer and it doesn't affect the visuals at all - how do you detect that it wasn't visible in the first place? Well, my first thought was that I have to make a trade-off. First of all, I defined some properties for each object marking its bounding box and the way it depends on the objects drawn behind it (called "occlusion mode" in my program). If, for example, an object gets drawn by ORing the pixels below it, it will require them to be drawn. If not, you can skip any objects below and continue drawing the non-drawn areas. I had several ideas how one could do this, but they either result in slow drawing or enormous memory usage - though CPU time is more critical, so more used memory isn't that much of a problem. But still, my first ideas used exponentially more memory for every new object hiding stuff below it. Z-buffering was also no option, so I decided to use a custom system - once again based on the almighty set theory. In essence, it distinguishes between front-to-back (opaque) and back-to-front (using previously drawn colors) objects and interprets their surface area as a set of outlined pixels. To bring only the visible pixel parts to life, it uses basic set operations to draw areas step-by-step with the optimal (in this case: minimal) amount of pixels necessary. I don't want to say too much about it, as I'm paranoid about it... there aren't many actually different algorithms for rendering graphical scenes, and I'd rather not see my invention implemented without my name on it. 
I don't mind GPL-ing it, but first I need to implement it. If everything turns out well, I will have a very set-ish and geometry-based algorithm useful for virtually every kind of shape. Though I can't afford the performance for shapes other than rectangles, I can see it being seriously multifunctional. I don't have many of these clear moments where all my previous ideas and experiences come together so well, but this time it's different and it shows (at least to me).
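Without giving away the whole thing: the one set operation everything hinges on is subtracting an opaque rectangle from an area that still needs drawing, which leaves at most four smaller rectangles. A generic sketch of that operation (the {x, y, w, h} int-array representation is just for illustration, not my real classes):

```java
import java.util.ArrayList;
import java.util.List;

// Sketch: subtract rectangle b from rectangle a, both given as
// {x, y, w, h}. Returns the up-to-four rectangles of a that b
// does not cover - the core "dirty rectangle" set operation.
class RectSets {
    static List<int[]> subtract(int[] a, int[] b) {
        List<int[]> out = new ArrayList<>();
        int ax2 = a[0] + a[2], ay2 = a[1] + a[3];
        int bx2 = b[0] + b[2], by2 = b[1] + b[3];
        // No overlap: a survives whole.
        if (b[0] >= ax2 || bx2 <= a[0] || b[1] >= ay2 || by2 <= a[1]) {
            out.add(a);
            return out;
        }
        // Strips above and below the overlap.
        if (b[1] > a[1]) out.add(new int[]{a[0], a[1], a[2], b[1] - a[1]});
        if (by2 < ay2)   out.add(new int[]{a[0], by2, a[2], ay2 - by2});
        // Strips left and right, clamped to the overlap's vertical span.
        int top = Math.max(a[1], b[1]), bot = Math.min(ay2, by2);
        if (b[0] > a[0]) out.add(new int[]{a[0], top, b[0] - a[0], bot - top});
        if (bx2 < ax2)   out.add(new int[]{bx2, top, ax2 - bx2, bot - top});
        return out;
    }
}
```

Drawing front-to-back, each opaque object subtracts itself from the remaining area, so pixels behind it are never touched; each subtraction is just a handful of comparisons and additions.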

Anyway, I'm not quite sure about the performance in detail. Depending on how detailed your shapes and overlaps are, the performance varies. I can't really predict how good the actual performance will be, but it should definitely be quite minimal - especially if you don't use back-to-front objects, because those require the typical "background first, top-most object last" rendering philosophy you can find in almost every 2D game. However, it should work fine on the NXT, I think. It shouldn't go wrong with some carefully designed, performance-optimized classes for describing pixel sets.

So wish me good luck with that one; I finally want to be able to say that I wrote a render system that kicks ass.

Transparent LCD display

I was looking for ideas on how to simulate different states of grey/transparency on a monochrome LCD display but ended up reading about transparent LCD displays! Just google it and you'll find plenty of pictures demonstrating them. Wow. I think that's dead sexy. This way, we come closer to the future of floating and glassy computer interfaces. Just imagine the possibilities! Especially for museums, fairs, shielded rooms with only a glass plate for looking in and such. One could use these displays as touchscreens and map things 1:1 behind the screen, to operate in a room with possibly dangerous material or something like that.

I'm usually not such a tech fanboy, but this is simply genius. I didn't know about it. I am - after all - just a stupid programmer.