Actually no need for lists

I couldn't get my ass moving to make function notations out of my item macros. However, the (thus not always working) list macros themselves are sooo minimal that at best only one macro is inserted and nothing else. Using the sentinel itself as a list item fulfills everything you'd need for managing a complete list, including all its operations. It's absolutely null-pointer safe when passing correct parameters, will have some awesome features I always wanted, and it's extremely compact in instruction count. Even GCC's lowest optimization settings are able to shrink the code size dramatically - so yeah, there's no real need for additional wrapping macros. Well, about counting list items and so on... It's no problem counting them manually. Just add var changes here and there... that's not the world's problem.

Hooray! Now we're getting closer. Next stop: macro stack operations!

Put the ash of your deceased into bullets

Ah, the crux showed itself again when logic evaluation became more of a problem. The point is that I need a temporary variable for certain list actions. Not in the item actions themselves, but when trying to insert an expression whose content will get corrupted when it integrates itself into the formula. Well played, preprocessor. I don't know how to solve this without requiring a temporary variable. At least not if I want to keep it safe and consistent. I find this especially annoying in the current situation where I don't want to care about temporary variables, you know. But heyheyhey... I got an idea. I can create a new scope block using "do { ... } while(0)", which should enable me to create lengthy hidden variable names... One temporary variable name I really love is "_". Nifty, eh? And it really works in most programming languages. However, I should add a postfix to signal its relation...


I'm actually thinking about writing more things like my admittedly simple stack implementation as a collection of macros, cause calling those functions and passing arguments takes longer than two simple instructions. Now that I'm almost done with my macro list implementation, I'm really surprised how well one can write code using them. I'll definitely provide functionized versions of them to reduce compile time and increase comfort. I built in some restrictions, like return values, so that it's not possible to generate errors inside the macros except when passing bad parameters. The functions can provide safety and ease of use where the macros can be used for the highest possible performance and possibly a smaller footprint. Good solution; I'm getting closer to the optimum for my target platform.


Cryptic in video games

If I'm ever going to make a serious game with all those nice open-world RPG FPS story and 3D graphics thingies, the main theme will juggle around cryptic and confusing storytelling that puts emphasis on using dialogs and tasks to form the game's main feeling without bringing any clearly understandable story or topic. Somehow the same as what psytrance does with random vocals: using them as instruments rather than telling a story, but now in a videogame format using random dialogs full of incomprehensible logic as elements of atmosphere and gameflow. I think this is quite interesting as a main theme cause it enables me to make full use of my ability to create exactly such stuff under certain musical influences.

Gnnaahhh... today makes the nerd in me quite wet, I have to say.

E.Y.E: Divine Cybermancy

Today is a good day, not only because I found the best linked list implementation ever, but also because a game whose future I wasn't certain about in the past has finally arrived on Steam and can be bought for around 18€ from today on. The game's name is now E.Y.E: Divine Cybermancy and it looks even better than what I saw in the past (though the title's literal format is equally strange to read as S.T.A.L.K.E.R.; seems I always get stuck with such titles for a reason). So when I finally get the ability to use Steam again, I can download it. I've already paid for it, so I won't have to do that then. I'm really excited to see how it plays and what it is about in detail. It seems they implemented the amount of features and atmosphere I always loved to see together. I haven't played any good cyberpunk FPS in the past, so I'm really looking forward to it. When I get my new line it will be around 6 times faster than before. Though I live in quite a big town with all ways of getting highspeed internet at your home, I had to stick with a maximum of 100 kb/s, which is a total pain when downloading new games from Steam. So I'll probably get 600 kb/s in the future; you can't believe how happy this is making me! The company told me 700 kb/s would be possible, too, but of course I'd have to choose a way more expensive contract (10€ more for just an additional 100 kb/s), so I stuck with the lower one. Well, at least I'll get what I paid for. Seems that finally worked without any problems. Can't wait to start Steam again; I've been without any of my recently bought games for a month or so, I think...


Awesome, I can now operate on lists using only three macros. That's the most awesome linked list implementation I ever wrote and possibly also the best one. No stupid interface stuff, just what you need. And creating a complete list is based on having a single item as your sentinel, that's it. It's almost as if it's the most perfect linked list implementation I could ever imagine. Really, it's just awesome. It finally paid off. I love when I get the valuable rewards of a longer research period. It brings everything into a different light, really nice I have to say. I'll only add a few extra features like counting and then some assisting macros for working with non-ring items using sentinel nodes.

And it's getting even better

Seems that macros give me some kind of good environment for brewing new ideas! Seems that I don't need any sentinel node or so, cause I can leave all the stuff as a ring. Link any lone item to itself and you can connect them in rings with a minimum of code. This makes, logically, no list end, but since every list stores its first element as a pointer, you'll know when the next element points to the first, marking the jump from end to start. Amazingly simple! You see, the best results always come when dealing with something for a longer time. I made it my goal to get the shortest, most generalized and simplest-to-execute code ever, and I'm getting closer and closer. Thinking about my future programming life and what I'll probably do for a living, I don't think it's necessary to "show off" your own code design, but to know when what is a better solution and also to stick to it as long as it's necessary. I think through my very general thinking about all this stuff, I got quite good at it. The more it forms in my head, the more I realize how irrelevant it should be for your employer to see what you already did, but rather what you can do for him and how good you are at day-to-day software development. Since I'm quite fast in creating and coding new appropriate solutions, I believe it's best not to overthink it and let your muscles play when it comes to that. Anyway, I have my term break, my student job will hopefully be realized in a few weeks and I can code my very own stuff.

And I'm so glad I found this Wikipedia page about sentinel nodes. I shall kiss this dude's ass who invented it!

Quick decision

I think I'll implement all linked list operations using macros. Macros are very powerful tools in C, and given the fact that you don't even need a handful of variables, they are suited for direct insertion as well as for writing functions that wrap them in a more library-like fashion. Also, they can be used function-style and encourage reuse of temporary variables. I have quite some experience with simplifying code now and think it's the best solution for C I can think of. It also frees me from those annoying extra structs and so on. It makes coding close to hardware more enjoyable, and I think this shows when taking a look at how little information about the underlying data they need when using them. And I think I'll at first focus on using it on the NXT; this makes my goal a bit cleaner compared to just "make it good" or whatever. I'm sick of that, I want a target platform and I think I found mine. Well, if I held my Pandora right now (which I don't), I would've started earlier for small target platforms, I think. Using macros makes generalization and optimization easier cause you can define the final output. It's just an ultimately invisible abstraction layer, the only really valid method of generating code in C.

Oh god, I'm so sick of putting everything into functions; good that I have macros at hand.


Made an overview

I made a simple implemented/planned/declared/WIP mindmap for ITK, so I can keep track of all the stuff I've done so far and what's left to do. I think I'm now more certain about what ITK should become and why I won't add certain features to it. I can remember I started coding it to simplify lowlevel features such as memory and string operations, structures for managing and accessing data, vector and matrix operation simplifications and so on. So far, I've done the memory part, and the interface for simplifying a dozen nested for loops for vectors, matrices and arrays in general is also done. As I mentioned earlier, lists won't find their way into it if they aren't doubly linked ones. That said, almost everything is done for ITK and I only need a few more things and I'm done. I'm thinking about pumping ITK up to a subset of toolkits, all having their own functionality. That means I don't only have ITK for a mix of stuff, but MTK for memory operations, STK for string stuff or similar. But then I want to also use GTK for graphics and notice that it's already taken by another, quite famous, library... So whatever happens, I want to keep separate sets of libraries. This makes progressing easier and I can focus on a single kind of task without switching back and forth like a bobblehead. This makes it possible to implement certain features without thinking about their necessity if I also have another solution for them... Anyway, I need to find names and separate them distinctively. Calling something "Imperial Toolkit" is a bit much. Especially cause an empire implies covering reign and not just a cute lil store chain or so. However, they need to have a certain... catch. A certain catch in their name symbolizing their connection. I'll mindmap around a bit and see what comes to my mind. Excitement!
Ngnaahhhh, I feel the veins pulsing in my chest.

Dev-C++ update

Hey dudes, that's something worth celebrating! Someone made an update for Dev-C++, resolving bugs and adjusting compiler versions and so on. That's awesome cause it was my favorite IDE back in the days when I was learning C and C++ as well! It was more some kind of bad C, but who cares. Anyway! I gave it a try and it really works as I know it from the past. It has a mixed interface, but works fine once you've remembered where what is. It fits very well into the kind of software I used to program with before: tabbed editor part, list of files and compiler settings as well as code completion. It's closer to the minimum of contemporary IDEs, but I always favoured simplicity and solid arrangement over too many fancy doodads here and there that keep the coder from focusing on his product. I'm a bit tired of sitting on my small Linux laptop all the time, so it's perfect to settle over to something else.

Using this IDE I can't remember having any problems coding on Windows. That's mostly cause it installs its own MinGW system apart from any system-wide setup. So yeah, that's the perfect case for Windows development: keep it isolated from fragmented trash and other external system threats. Only problem is encoding...


Offline repost #3: The sentinel node

Heyheyheyheyhey----- Wikipedia once again brought some fabulous ideas. My problem with list complexity could get an interesting twist by using sentinel nodes. The idea is to not store separate pointers for start/end, but to provide an "invisible" node that's meant to take the null pointer's place. So when inserting an element, there's already an existing node you can link to on both ends of the inserted element. This way it's possible to insert elements with less overhead but equal memory usage, as well as a simpler interface with certain extra features. It's a lovely idea and I shall once again read Wikipedia a bit more when I'm having problems.

Seriously, this is an awesome thing. Everyone should have it. Or at least more guys like me should read Wikipedia. I just have to find something interesting and more exciting than my old implementations so that I feel good when writing it. Well, otherwise I'd probably rather play video games or so...

Offline repost #2: Back off, think about something new

I'm still not done with lists and iterators. It must be some kind of perverse fascination I have with them. I just can't find a satisfying way of managing all that stuff while not limiting the user or having to blow up the code. I'm mainly having my problems with integrating rather "exotic" structures like singly linked lists. Normally, I'd be done with this in a few minutes, but since I want to finalize it as well as possible, I'm having my trouble with it. First, I of course want to keep it at as few code lines as necessary. I moved away from inlining in favor of small code size - that will be necessary for the NXT. Second, I also want to keep it minimal in terms of interface design - I always stuck with simple, self-managing items without any need for a bigger list structure. Thus, it's hard to have the same comfort as when doing it directly. Sooo, I just want to squeeze and squeeze and squeeze. In all areas. And here we start with the problem that I'm coding singly and doubly linked lists side by side. I see them as extremely similar, though selectors/iterators are very different in how they perform due to the lack of similar internal structure. Some features require a shitload of unique checks, a few extra variables, special cases and so on. You start writing something and it goes boom later cause exactly this one case won't work but can happen very often. So it's simply not possible to cover them both in a single sentence. They are so different to control in effective access that I'm thinking about completely dropping them. I've a bunch of cool ideas, but all of them are only possible with doubly linked lists, and for singly linked ones... Well, let's say I just can't understand why anyone would want to use them except when storing millions of data elements without enough memory for a doubly linked one.
However, maybe he should then think about using something different like an array, cause he'll need to iterate through the singly linked list as much as he would using an array... They simply have no advantage except decreased memory usage.

So no more singly linked lists. I'm having quite a big set of generic n-ary node collections and such, but there is no actual use for them outside. If I'm ever going to need some structure that's recursive, I can surely build one if I want. But I don't see the necessity to use a generic singly linked list for that.

Oh and Wikipedia has an interesting article about XOR linked lists.

Offline repost #1: Doomed lists

It's interesting how often I consider certain features more valuable than sheer performance when designing stuff for ITK. For example, linked lists have always been a long story with me and the associated implementations. Especially the inline argument was one of the reasons why I previously chose C++ but forgot about once I had settled with C. However, I'm still working on perfecting the concept and learned a completely different side of iterators that doesn't count on being generic but simply more memory-efficient! Yes, that's right. Imagine you have a singly linked list. How do you want to operate on it? An internal selector, so that only one selection is possible and it pollutes memory for multiple, non-parallel accessed lists? No, that's just too much of a waste. So the only really viable solution is to use a special selector providing the necessary selection data, which is at least three pointers for a parent-bounded singly linked list. All in all, it's an iterator. I saw that coming before, but so often I started doing something different. Back to the original topic: designing the parameters of iterators is also quite questionable. I mean, did you ever think about whether to use two functions for inserting/deleting before and after, OR just one that's passed a parameter indicating the desired side? I must say I'm currently preferring the latter cause it simplifies code and works the same for both directions if you carefully arrange all next/prev start/end relations in a two-entry array. Sure, you need to check some more parameters, but it also gives you a lower code amount (remember, that's also for small devices with less memory) and reduces the amount of corrections you need when something goes wrong, due to only half the code!
It's definitely not as performant, but as so often in my later programming experiences, I see a direct connection between small, less fast but efficient and maintainable code and slightly different, generated code with more performance on systems with much memory. I can't help it, but this is what my world's about: performance, memory, efficiency and improving the lower layers. All the hidden "secrets" none of my fellow students care about or want to (or maybe can) understand. Some real-life company for such things would be lovely.

Oh, and just for the notes: singly linked lists are extremely crappy when it comes to O(1) interfaces. I mean, you can either provide a unified set for every kind of list (iterator) and insert a plethora of 1->n iterations without the user's knowledge, or only provide O(1) functionality while limiting the interface. The latter is my favourite but brings strange things with it, like having the ability to insert stuff at start and end, but only remove from the start cause you can't link back from the end to its previous element and so on. Really weird and crappy, but faster than otherwise. I'll have to take a look at what GCC puts out as assembler when inlining all modules into one single cpp. I could only guess that it's either capable of removing all the parameter checks (which would require some flow dependency checks) or doesn't even remotely do something about it. But it's GCC after all, a massive pile of code made to support even the most trivial things you'd otherwise do by hand. So removing parameter checks IS a trivial thing for certain cases, but maybe not in all that C supports. The more I think about it, the more I understand why it must be easier to write a C optimizer than a C++ optimizer. Or they're converting C++ to C, which I guess is the usual case.


Freeeeeeeee as in "Freeeeeeeee"

Yay, that's what I live for: term break! Finally the freetime I deserve for me and my hobbies. The problem with being a student of applied informatics is that you have to spend the majority of your free time working on silly assignments and such (at least this is the case for me). Sure, there are free blocks, but I work best if I can spend whole days on something I'm longing for - games, code, Lego guns, music: all that sort of stuff I have love and passion for. In fact, I don't have many other activities besides watching movies and series for nights on end. However, a new episode of NSFW is out, the podcast I'm listening to, and it brought a shownote link about another podcast which again linked, etc, etc... you know the story. In the end, I started listening to my beloved Aphex Twin album "Polygon Window" and feel so great just by listening to the tracks. It's so absolute in terms of track design that I feel compelled to resume my musical work. I was always more that techno/IDM type of music creator (you know, beat patterns are more natural for me than any kind of actual score) with a tad of ambient pads and random acid sequences. I think that reflects me pretty well in my quite comprehensive list of published tracks. I think a restructuring of them can't hurt. And this comes along with a new alias associated with the music I'm doing. "Exocore" is something I created in my previous series of tracks where I fortified the stuff I like to work with. So it's time to reinvent myself under a new name and a bit of a new rig. Technically, however, I'll base my work on the KORG synths I have my licenses for. I wonder whether I can change the USB-type model of keying/license access to something that doesn't require an external medium. Well, let's see what will work better or if it's even changeable. In all cases I want to get a USB sound card, so I can move my rig to other PCs more easily.
I only need good latency, maybe some other features like EAX in games, but in general I can't deem a non-portable solution fruitful in my case. It could cost me a bit more money than before, but that's ok - I get a deeper satisfaction when thinking about the possibilities, a must-have for anyone calling himself sane.

About the name to choose, I'm not certain. At least not today. But I'll choose something fitting my current inner ramblings.


Damnit, I got so used to QtCreator while working on my assignment, I can't say how often I tried to press Ctrl+R to test my code in a simple text editor. Also, not having to specify those long filenames is very convenient. Plus I hate writing makefiles cause it's like explaining the obvious to a moron.

A lot of new stuff

I was waiting 12 hours at my university just to talk for 6 minutes and get a mark for it. Besides that pathetic fact, I somehow had to kill time and managed to use at least half of it as a breeding ground for some from-scratch ideas for ITK. This includes a custom import/export system you can use for serialization/deserialization as well as compressing, converting and so on. It's a simple system: there's an interface defining an import and an export function, both of them taking a mode parameter saying how the data should be exported and what kind of data should be imported. Simple as it is, these modes are: smallest size, fastest reading and fastest writing. That's rather useless on its own, but when called for all the small components of bigger structures, ultimately making exportable objects of them themselves, there's the possibility to completely dump data from RAM into a squeezed format and restore it later. I solved the problems with pointers and the later-changing data addresses by providing a mapper-like system that should work fine when not overwriting any source address while exporting/importing. All in all, I think I got a nice concept down to solve one of the problems I always had in programming: what's the ultimate way to store data? However, since it's a generic interface and not bound to that address mapping, you can also use it to export to XML, for example. Everything is possible as long as the target is in a reachable memory area, making it necessary to memory-map files or other external storages. So yeah, I think I'll go with this concept cause it's the only one I can imagine that can remove any redundancy and also work fast when exporting and importing. Yep, that's it I guess, and from Monday evening on I'll have all the time of my term break to work on all those ITK features I started during the hot phase of this semester! I've also started working on some more interesting list concepts like rings and segfault-safe linked lists and such stuff.
Really interesting things I'd love to realize and see in action. Also, I finished the final format of my vector operations by using some macros, and rather special macros to replace ever-repeated, common structures inside the code that one can combine into very, very complex calculations. Also, I found an alternative to n nested for-loops for rectangular area tracing. Yes, I can say I'm fully inspired by the freetime I'll have from Monday evening on. Ahhh, the breeze of freedom and anarchy. Not in the political sense, of course.


Tiny C

I've stumbled upon an interesting project called Tiny C which claims to be much faster than GCC, which can only be a good thing (and trusting their compilation time tables yields some impressive speed improvements). But what's really interesting about it is that it was originally written in obfuscated C code... I took a look at the original version and geez, it's amazing that there's a compiler behind it. However, it is also an awesomely small compiler you can use if you don't want to install all the GCC stuff, or only have a Windows system and MinGW is too annoying for you. It has the same command line options and also some compatibility with GCC's own features, so you can start compiling a lot of stuff on your own. But that's the only thing it really gives you. It has fast compilation speed, can directly execute C code files from scratch and even from the command line, but doesn't support as much candy as GCC does. Well, you can't get all that nice stuff in such a small package. There are also not many optimizations done except constant propagation and some cache twiddles. So what you get is quite direct code, and I think that's what counts after all. It's probably really cool for testing your own code with and so on. I think I'll use it as my main compiler for normal Windows development, so that I don't have to get my head into MinGW again. Nothing against MinGW, but I just don't like it, no idea why. However, Tiny C is also only a one-pass compiler, meaning there's no abstract intermediate used to apply several optimization tricks, platform-specific commands and so on. Reading about it makes me realize how appealing the idea of bytecodes must be for developers. Of course my programming language is, in concept, based on a bytecode, but I'm DIRECTLY translating to the bytecode, as every command listed is a bytecode instruction. It's cool to see Tiny C having none of this - small, fast and efficient in the task it's designed for. Just how I love a program in concept.
Though it's too bad that the interesting thing about it is also its weakest spot: no assembler output, no optimizations.

So I had to use goto

Sometimes there are moments where someone comes screaming that gotos are evil. Well, technology is never evil, just the evil person using it. But have you ever encountered a situation where the language didn't provide a certain control flow feature, so you actually had to use a goto to get around it? C is a great language and I love coding in it. But the designers missed some parts that could sometimes be very nice to avoid a set of ifs, like when you want to break multiple nested loops, for example. I had this just a few minutes ago and remembered that there is a goto statement in C, which you can only find out by reading code where it's used - mostly kernel code, drivers and so on. Originally I found out about it while reading a Game Boy emulator's code, which also had some other things I found rather old-fashioned in idea, and I simply didn't find any use for it in my code back then. But it is there, and at least I have this example when punching down someone's "goto is evil" screams. It's hard to explain to people that certain things are not evil without an exact example. Reminds me of the typical scientist believing that nothing is right until god sucked his dick or he got an idea how to put it into formulas. However, I often had situations where an additional check was required cause the language didn't support direct jumping. From today on I shall consider goto a solution for things I can't formulate in the language's flow control set. Though I'm glad that everything was solvable without gotos... they can still be a tool to form professional garbage creators.


Finally something more substantial

Yay, I discovered that the semester will enable me to do an NXT project! Awesome, now it's getting interesting in here. At least this is what I found out via our student information system, and I hope I can do this with my very own brick and nxtOSEK. It'd be awesome to simply use all the things I want to finalize during my term break. Bare C without any Java bullshit in between. How can this not be the best thing ever?

Interfaces in classic C

I like interfaces. They provide an abstract hull for a concrete implementation - a very good solution for more complex operations that share exactly the same set of operations but differ in implementation. However, I hate interfaces if it's about memory storages like linked lists, arrays, trees and so on. I strongly dislike the concept of iterators, for many reasons, including non-predictable runtime performance if you look at the code without knowing what datatype is behind it. Algorithms should always take advantage of the type of storage they are designed for, no discussion. However, I like interfaces for stuff that bears some more abstraction, possibly not differing between RAM and harddisk and stuff like that. Though it's, all in all, all just memory and can be memory-mapped nowadays if it's not coverable via non-mapped addressing... I struggle with the decision of rethinking some elementary ITK principles, weighing whether I should keep it purely based on direct addresses or extend it with interfaces. Not using interfaces will force the user to create his own memory mapping, but doesn't blow up the whole toolkit. And using interfaces for data storages... No, this would put me even with normal C++ programming and iterator-like concepts. If I stick to one concept, I'll drag it through hell and back, that's it. I believe it's better to rely on those common hardware features provided by at least Unix/Linux systems (which I doubt). Interfaces produce quite a mess, and the only place I could deem them fruitful is when it comes to dynamic and automatic use of them, chain operations and so on. For example, my forwarding memory allocation model works like that: you just need to pass a second interface in case the memory isn't allocatable from where you requested it. I could also leave it to the user to do this, but that's a nasty thing cause you usually just want to get some memory and know whether that is possible or not.
It also gives you no advantage not to use interfaces in this case, cause there are so many places you can allocate memory in that it'd be a mess to always implement that wherever you need to allocate stuff. So, passed once, it stays the same all the time and you don't need to worry. That's different with iterator interfaces, cause the only most efficient algorithm is the one that makes best use of its storage class - there is no other possibility; even the most optimized, abstract algorithm has its caveats depending on what kind of memory you're operating on. However, that's a tad of philosophy I have to admit, but that's what drives a programmer nuts, isn't it? Well, at least I know that better results come out when a programmer realizes that.

Hm, I guess I sorted interfaces out this time. Sure, all ITK modules have interfaces, too. But that's on a library level, not on an in-program function level using structures and so on. I also created some kind of "roadmap" where I'm keeping track of all the modules and ideas I want to realize, as well as what's finished and what I'm working on. Some things are finished but not yet tested due to my manic type-without-testing approach I sometimes pull off almost perfectly, with no compilation errors or bugs. Especially in my case, it proved fast and safe in development cause I can hack in all my ideas and implementation details with no fear of bugs or annoying error outputs and then systematically debug with no interruption until it works as expected. Maybe I just found the perfect programming rhythm for my kind of person. But I can't deny that the process itself (header first, implementation following, debugging loop) is heavily interface-based cause, as I mentioned above, all modules have interfaces, too. Even a simple function prototype for a single cpp file is an interface, though an exceptionally minimal one. It's, however, good to know that I can still wrap all my headers into component-like access structures. The last assignment, which I'll have to present on Friday and where I had to program a complete component platform just to get the mark using Qt/C++, made me realize how interesting the concept of dynamically loading binaries actually is. I noticed that before, too, but I didn't do anything to test it for kicks. Now that I simply know how easy it is, and how much easier it is with conventional C (at least for Linux developers), I think it's a good idea to provide some wrapper functionality for it later. Wrappers are your best friends as long as they provide only one additional layer between you and your victim.


Some signs of life

I wasn't that active lately and noticed a few posts I saved someday on my harddisk but never actually posted here. However, I want to change that in the future and will try to keep my thoughts as fresh and immediate as possible and not just dump them somewhere like used condoms or porn magazines (well, some people might extend that to partners or people in general, which is quite an unnice thing, to say the least). However, one of the things that came to my mind quite surprisingly is the fact that so many libraries have some kind of error retrieval in case something happened, but most of them don't keep it consistent or even covering enough. I prefer not storing any errors due to the necessity of static or at least some temporary variables. But I can't deny that sometimes you need to store a list of errors that happened while a function does its work. You'd store them in an internal list or last-error variables or so - something with which the library user can find out what's going on without having to parse string messages meant for the console. When writing OOP classes for uni assignments, I prefer either the approach that users should handle errors on their own or that there's some kind of error storage like a list they can export to an array or whatever if something bad happened. It's a simple and clean way to not forget anything that might've come across your problem. But have you ever thought about how to generalize that? There are exceptions, yes, but they are nasty cause they alter the program flow directly and don't store any information about other exceptions thrown before them. Personally, all my designs are based on the assumption that there are not many errors preventing a function from doing what it was supposed to. It gives back what it was able to do, so that the caller has the possibility to repeat it again with different settings or data.
This increases the autonomy of program parts and may spawn completely new and interesting problems as you continue developing. It motivates programmers (at least me) to design code that is able to check, cope and live with errors, making the best out of it as well as at least trying to achieve a part of the stuff it is supposed to do in case of error. Back to the topic of storing errors itself: I noticed an advantage over classic procedural design. You can create a generic error-handling class that's implemented and documented once - not more, not less, and you never have to care about it again. It would have all kinds of really nice functionalities like directly printing errors, converting them to arrays, lists, program-readable code data as well as human-readable console text or so. It's one of the few things I'd deem OOP useful for: generalizing the stuff you have to do all the time, not writing getters, setters and other complete interface and inheritance trees. This is also why I think about not only continuing to write itk as a big set of C functions, but also in a rather custom, interface-based format using structs and function pointers to guarantee such comfort for iterating errors. But no, that would create a mess of documentation, useless pointer structures and serious maintainability problems cause you'd also have to set up those function pointers... No, this isn't a good idea and I'm probably better off leaving it at the classic approach of true/false return values. For more complex scenarios, e.g. complete frameworks or a game engine, I guess I'll simply stick to a structure for storing errors that the user as well as the functions can manipulate. That ensures high performance on lower levels and comfort on higher ones. I'm again creeped out by how little my fellow students do to prevent errors. It's all just "just get it done" and not "get it done but also do it well".
Quickness is for sure always a vital point in development, but that doesn't mean it's a good idea to sacrifice quality for even the smallest possible speed advantage. Better to grasp the big trade-off chunks before you start and think twice before you do it. Quality is still something I deem most important for software. Personally, as a customer, I'd rather wait a few hours more to get a better product than a half-hearted speed development. Look at PC games and operating systems. Most of the time you can't even try them out for a while cause they crash or dick around otherwise. You paid hard money for it and you're not getting it back until the dev's melon head comes around with a better solution. Do you really want something like that? No. So better replace your urging manager with a better advertiser to whet the people's appetite in the beginning or even to motivate them to pay BEFORE the software comes out. Preordering is a common thing today - the better to stack some more months on top and get a bit more money to accelerate development.


IDE vs. editor (once again)

You know that feeling when you're editing a piece of code inside a stand-alone editor and someone else comes around, looks over your shoulder and says he doesn't understand how one can program without an IDE? This happens to me quite often and I shouldn't be surprised: almost no student around me has programmed using actually minimal tools. And if I think about it... I learned to code in BASIC before. It's common among the not-so-popular BASIC dialects to only have a multifile editor with build support for the current file, where the only project-like structures come from including other files or specifying build options inside the code (I always liked that approach for universality reasons). So it's typical that I prefer a simple scratchpad that's very basic (surprise, a pun) but still fully functional. I simply can't explain it to the people asking about my code editing preferences. They use code completion, automatic bracket insertion, code formatters and error checks while typing. But the only features I really use are syntax highlighting and (again, very basic) automatic indentation. I think because I used those simple editors for a very long time, I got more than used to them and found joy in their simplicity. There are so many IDEs, and they have so many functions whose use I was never able to appreciate. So when you don't use any of the features it has, an IDE only becomes a slow, resource-draining piece of monstrous software - which is probably why I don't like IDEs that much. One exception I found over time is Geany, a very lightweight IDE you can use for a few files or projects with more than two or three files. Having used QtCreator for a while now, I once again appreciate the central convenience of source management, but this doesn't pay off when working with more complex file and project structures that depend on each other.
Maybe the software isn't as finished as the version number suggests, but all the stuff it offers wouldn't be necessary with a more elegant and simple API. However, my expertise isn't work with big toolkits but bare work on a system or even directly on a machine. This is where my approach to editing is more useful than an IDE: I can work with nothing but text if it comes to that - can you? And don't say this doesn't matter today, because it actually does. It doesn't shine through all the glitter and glamour of today's software development habits. But if it fails, everything else fails, too. That's why rising nations fall hard if they forget about the small things. A matter of fact you can't deny all the time - especially not if it's standing over you and pointing its sword at the precious parts you value the most.

Component-based Development

I come 'round quite a few strange concepts from time to time, but most of them are just candy. Candy for those project managers and wannabe software "designers" with emphasis on design for presentations and papers. Old bricks painted in new metallic colors but still providing a wooden core like always. Though... they use rather old wood, not up-to-date stuff, but put their new labels on it. However! During all those lectures about component-based development, the lecturer didn't even mention what it actually is in detail - not even in a sidenote! What a pity that another one wanted us to write an assignment using it. I didn't care about it until a few days ago, and since I had a lengthy discussion with that lecturer, I noticed how much I actually have to do to be able to call the project - which we're not really working on anymore - component-based. I really coded him something that's, on a binary level, exactly what he wanted. But you know what? He wants design. Yes, design. "Component design". As if there isn't more useful stuff on the planet. Components and their related development "form" are only a new, fancy and object-oriented term for the good old dynamic libraries. Really, it's that simple. Everyone saying something else either sees it differently from how I do (on a plain and direct binary level) or again sees only magic and fairy dust. However, those components have to be reusable and generic. How funny. Millions of C programmers did that in the classic manner before. Now it's supposedly so different, but it's still the same with more glitter and sparkling light effects.

Stepping aside from my usual rants about those marketed approaches, I had to "invent" some kind of component platform to fulfill the lecturer's desire for more componenty "design". We are using Qt as our framework for database access and GUI control. Qt itself does not provide any component platform support, but you can load dynamic libraries and Qt plugins to your liking. At first I simply loaded two singletons, connected them with quite a lengthy main code and that was it. Too bad that wasn't a component to him. So what I'm doing currently is to double the amount of code and essentially insert it twice to give each component the same abilities as the other, only with some different operations... pathetic. He's so deep in his "omfg those components HAVE to be reusable any time!!1" that he simply forgets it would be totally useless in our case to do it the way he wants... but whatever, that's the source of marks I have right now... I shouldn't screw it up just cause I know better. All in all, the only interesting part about it is coding the component platform itself. I made a really microscopic platform that works by user-based insertion of libraries and then accessing their singletons using string identifiers. There's even support for component dependencies and so on - I think this, along with some more "reusable" (oh how I hate this term when it's used like that) component functions that can be configured alone, by using Qt objects or by guessing other components' settings, should really be enough... I mean, what is he aiming at? I can code him whatever he wants in a few hours (the component server didn't take me longer to finalize) - less time than is required to design his stupid components. The secret doesn't lie behind time-wasting development but in knowledge and ability. They should rather teach their students how to think and design like a computer, translating real-life problems and questions into a computer language of their choice.
Yes, that may sound quite generic, but this, as well as knowledge about technology and methodology, is where it really matters. Most students I know can create a JSP webpage but don't know enough about Java/Javascript and its abuse to get the webpages performant or even complex enough in function. I still believe it's more fruitful to teach them how to make stuff good and not how to follow another stupid marketing gag. One could say that companies see it differently. That they want to train people on their own and only need them able to operate tools - nothing more, until some of them have good ideas or so. Sad world, sad world.

Somehow I thought about releasing that kind of platform system to dynamically load and access interfaced libraries in C. I still have to create something like that for my programming language, so it's good first practice to do so. Of course, the assignment version is Qt. And of course, C has to do it all on its own, with less feedback and comfort. But it's all in all more direct and makes you able to see how few things it actually consists of. Simply loading libraries, nothing more! So easy, so simple, nothing like those stupid models. Components are for wimps; use dynamic libraries.


What the fuck

It suddenly clicks: you've got the greatest piston elevator model ever built in Minecraft, and then you see that there's another fucker on Youtube with exactly the same construct. What a waste of time.



I always wondered how Linux desktops were able to share filetype information without converting between their own formats. Then I remembered there was this MIME thing to uniquely identify filetypes using strings with a very simple notation. I'm not sure whether they actually use it internally, but I like its concept and that everybody can send in their own type specifications on request. It's truly a worthy thing to do: rely on it to detect file types and you can handle them later using this type information. Though currently not deeply rooted in my coding interests, I like the idea of passing a file stream and then getting back a simple string or int type identifying the file type. A lovely idea! Really. And also a lovely idea to implement it for ITK! But it will be quite repetitive work to implement all the file header readers. Well, at least I only need to read the headers at all. This should keep the code small. Gnaahh, my inner nerd is getting giddy about the idea of getting the actual filetype with a single command. Though quite slow due to heavy comparison work and massive in the amount of required code, I'll simply include it as a not-yet-implemented header file in ITK. I have to say, this is in general a comfortable way of implementing features: write a header with all the structures and prototypes you want to provide and implement it later when you have the time, or just give it to someone else working in your team. Effectively the lowlevel variant of the class interfaces they all praise so much while not understanding that all this stuff was already there before they really started with it. Whatever...

strings and chars

The more I think about string formatting, the more I also think about what would happen if I chose to create a game engine that uses special characters in strings to, for example, print more than just plain text - maybe totally different symbols, even icons. In fact, char's flexible size doesn't make it suitable for that. But the question is rather whether one should actually do something like that! Some of my earlier programming experiences included such a system, where I repurposed certain characters to print smilies and other sorts of logos or icons inside the text. It was no problem for me cause I never used those characters. But what about guys using them all day? What about French guys using '´' and '`'? What would they encounter when using them? Well, probably they'd just not use my engine! And that's why I'm thinking about it more than about anything else right now. For now, I decided to leave all common ASCII rules aside and let the user define a custom table for whatever he wants to print. This is slower than taking standard ASCII values, but it ensures that everyone can choose whatever symbols he wants or needs - automatic mapping. I got that idea when designing integer conversion routines: design one function for converting to hex and another for decimal, though the algorithms behind them are EXACTLY the same? No, thanks. When writing code in general, I often encounter situations where I'm forced to either copy a single function a dozen times and alter only one line, or invent a rather generic solution. Currently, I prefer the generic solution. Though for performance reasons I prefer to copypasta memory-related code. One could argue that converting numbers to strings is as important as copying memory, but I disagree here, cause from what I'm used to, it makes more sense to rely on a purely numeric representation and print it on demand if dumping out numbers is a performance-critical point.
Going back to the topic with this in mind, it makes sense to not rely on special characters in strings but to add a custom XML-like formatting for more complex texts and layouts later. I really like XML and fell in love with it while learning HTML back in school. Looking at its design, it's really a totally primitive but effective solution to the typical problem of encoding human-readable data. I've never found anything nicer for simple data, mostly cause everything else tries to copy programming languages or integrate into them (for example: JSON). However, covering the whole XML standard is as annoying as anything else in the world of web formats (gladly, I won't have anything to do with it). There's even a notation for validating XML documents - inside those documents... Quite useless for simple applications if you ask me. Almost no non-web program validates them this way as far as I know. And why? Possibly for a number of reasons, including that it's easier to directly check it inside the program with a simple XML tree scanner rather than using a complicated notation that's not as flexible as normal code. However, there are also more web standards which can be of use for normal programs. URL notations for example (or URI, I don't exactly know what this one was). I'm always annoyed by how inconsistently most systems handle paths and such. On a typical desktop system, we should make more use of those standardized and system-independent notations. Knowing your enemies gives you great portability advantage.


The more I think about more or less completely replacing the standard C library with my own set of stuff, the more I notice how awesome the stack is compared to the heap. Seriously, it's the best temporary dump ever! A few months ago I wouldn't have been able to realize that: using C++ and its new quite often, always allocating around just to get a simple temporary array done. I don't know why, but I think it's simply because I was too deep into that OOP stuff to realize how bad it actually is. Or did I realize it but didn't want it to be real? Well, I don't know exactly. However, I hadn't thought about more clever solutions to automatic memory management - the stack IS such a clever solution. Simply go up and down, use a relative position, and we can do anything we want. There is no real difference to the heap except that it's managed automatically - so why bother with the heap all the time? Thinking about that, I start to see even more beauty in C's core concepts and how they translate to assembler code. So much stuff is done on the heap today, even for the smallest, highly temporary operations! What a waste, I tell you. We are NOT in Java; we are in the world of conceptually viable programming languages. Therefore, I can only distrust C++ more than before. Dynamic memory is still an operating system problem today - unless you do your own management system, of course. I started to redesign my own programming language to be more sensible about that matter and think it is best to get a bit away from my initial concept while still having every command as a bytecode. I'll try to get a symbiosis between comfortable highlevel BASIC-like simplicity and direct stack/heap relations like in C or Assembler. Most programs out there don't even NEED any dynamic memory. But still, they use it all the time and don't think about what might happen if they drop all the shit and go classic.
It's totally possible to do custom dynamic allocation: just allocate a bigger block of memory and manage it on your own. Of course, it's easier to just call new/malloc, cause they do exactly the same, only on a greater scale. Memory management is WAR, totally. Like it is with CPU time and other resources. Therefore, the less you have to participate in such kinds of brawls, the luckier you'll come out! Simple. That said, I can now only feel joy about how simply I can design my functions and the overall interface. I like clean and efficient code, and I simply think the prejudice found among OOP programmers - that C is old shit and results in spaghetti - is stupid. I, on the other side, know the differences and can clearly say that both have very, very different assumptions about what good program design is. Instead of saying anything bad about C, I'd rather call Java programmers spaghetti coders, cause if something turns out wrong because their models are too fuzzy to be understandable, they don't get what it is and start adding more stupid frames around it. Similar to how Windows grew weirder every now and then, those guys tend to not get it.

That's the problem with OOP: you can teach people how to use it and how to design the spaghetti models, but you won't be able to teach them its underlying system with only that knowledge. If they try to enter lower levels, they'll stumble and rather hurt themselves instead. But well, we always need pawns to sacrifice and triumph.


The price for working with morons

I've never really worked on a complete application/GUI skin on whatever operating system, but for one of our assignments I somehow had to put some kind of multimedia stuff on top cause the project my team had chosen was kinda non-multimedia and always will be. However, I was somehow the only one knowing that the project is totally shit on its own and that the other of our two lecturers would probably give us the worst mark imaginable if we didn't at least include something shiny or colorful. So I sat down, channeling the only creative energy in our team right out of my head, and simply picked up our university's colors (green, blue and orange with a tad of grey - looks better than it sounds) and based my complete design on them. A very simple but smooth and nice-to-look-at theme. I've been working on it for a few days and at least my friends gave me good feedback about how it looks. I'm not somebody wanting massive applause or whatever, just an ok whether it's good or bad. And then I'm popping up, showing it to a team member - guess what happened? "But this doesn't look like our imaginary CICD!". What. In. Hell. And then I remembered that one of them designed one for an older task long gone. Seriously, it looked like total bullshit and didn't even have ANY design rules or concepts applied. It was simply bad and not worth talking about anymore. I spent complete days from one morning to the next, and what do they say? Those stupid idiots. Pretending they know even one thing about design but can't even apply the most basic concepts.

Oh god, how I hate this. One day working with professionals! Just one day! Geez, next time they can do it on their own and I will make my own project with help from others. I won't let anybody else do the stuff I can do best among them. And again, the truly graded part lies on my shoulders, and I have to wait for those morons to finally send me the last bits of their bloody code. Actually, there was only one really coding as well as me on the essential parts. Another one point-and-clicked his way through a GUI builder (though his interface wasn't totally crap), and the last member... Well, let's just say she doesn't have anything to do at the moment. I feel like somebody trying to run against a storm while carrying 100 pounds of additional weight attached to each of my very own toes.


type IDs

I found some time to design around in ITK and added a primitive but convenient type ID system made to wrap existing data with type IDs so input and parameter data can be handled more dynamically. I didn't limit it to built-in types or so; it's open to custom types as well. Actually not much to talk about, but I decided to leave it as the "ultimate" comfort option, where you can program as comfortably as in any other language or with a set of highly overloaded functions. But the cool thing is that mine is dynamic. This means you can do things that would mean more work if you had to solve them otherwise. Well, most of the time you can't handle dynamic input otherwise. You need type information; it's always the same principle. One really nice thing you can do with the functions already supporting it is to create arrays on the stack and fill them table-like without any function calls or data fillers. You can directly edit the raw data if you want - quick and simple, exactly like I want. I'm looking forward to all the nice things I'll be able to do using it. I know I need lowlevel for specific cases, but I also know that dynamic type system comfort can, on rare occasions, be as efficient as fixed type operations. Of course not as fast, but solving the problem safely and elegantly without the bugs of overly complex program structures. I think this is a good path: lowlevel as a base, comfortable enhancements using type IDs, and special datatypes made for tests and early development comfort.