IDE vs... no IDE

Studying anything that involves programming and software development usually brings up a set of circumstances we all solve differently. A very basic question is whether to use an IDE or not. I learned to program inside an IDE because it's simpler to grasp - it's one application you use to code, compile and test your program. Nothing else. You don't need to worry about the single steps in between, and there is not much between you and your project except a learning curve and the integration of additional libraries and/or frameworks. For most of my life I only knew the compiler as a single component called by the IDE and nothing more. Command-line compilation was for hackers and people who are "pros" or other kinds of elitist-sounding dudes/dudettes. But over time, the IDEs I used changed with nearly every programming language. I can remember using around 9 different IDEs for 5 different languages. Even during my very first semester I was still using Code::Blocks to create my two C assignments. I only used a normal editor during the computer laboratory tutorials and to show off that it works so I could get my mark (the best one, obviously). I always thought it can't get worse than using a command line, except with Visual Studio. And this didn't change until I had to code Java. We had some IDEs to choose from - the lecturer used Eclipse (he is an IBM employee, so no doubt he used it extensively), but he didn't force us to create our projects with it. Then there was also NetBeans. I didn't like either of them. I had already tried Eclipse before and quickly marked it as bloated and more difficult to customize than necessary. NetBeans, hm. What is it? I had heard of it but never saw a necessity to install and test it. Sounded like a project I should come in touch with. WTF is Java? Well, let's say I wasn't convinced to use it either. All I had was a bunch of weird allrounder IDEs with complicated setup and configuration, and a simple text editor along with a terminal.
So my choice was quite logical - you know gedit and you know how to compile on the command line. It is already there and doesn't need any deep-thought configuration. Learning Java and its library functions without auto-completion wasn't as simple as I thought. I had to picture every aspect of the architecture in my mind to use it. No fancy project management, no automagically set-up classpaths and so on. Some things were weird, but blended well with my thoughts about Java's inner architecture. It was necessary to learn a bit more than others who used Eclipse, but it paid off - I didn't need to find a way around this huge monster of a development environment. I could start right out of the box. No fancy stuff was necessary, plus you could essentially develop EVERYWHERE. gedit is on every PC we're using at the computer laboratories. The terminal, of course, too. So I learned Java, and along with Java's really annoying and kinda inconsistent library functions also some really cool concepts I liked. And I think this positive experience with a bigger program of mine, done completely without any IDE, and some insights made me realize how useful it was. And especially how well it worked to not have so many gadgets to "ease" a developer's life. Later I also coded ActionScript this way, and every other piece of code I had to create, too. I froze my ASCII renderer and got through this really stressful BIGJam "weekend" I'll never forget (in a sardonic way). The rest is clear - I found no good editors, switched distributions a lot and ended up with GNOME since it worked and was familiar.

This, combined with my PC going insanely slow at times, forced me to think about the process of coding and also of compiling and testing. I wrote numerous posts here about editors and my hopes of finally getting a "lofi" IDE working, but it never worked out and I completely switched to the tools our first lecturer recommended to us: a text editor, make and a command line. We all used this simple setup in the very first lectures and it worked. The only difference was that I went from IDE to terminal while the others went from terminal to IDE. Could be 'cause most of them didn't have the IDE experience I had. Or just because I always appreciated the most direct way of understanding - start from the most essential parts you don't already know and continue to further derivation. It's indirect/complicated (IDE) vs direct/simple (terminal). Or Zen power vs forced productivity boost. I was always used to having a class/function browser, auto-completion, custom indentation etc... I never realised how useless they are if you actually want to learn something as a whole until I was forced to experience it. Of course, I like it if it's available, but I just don't use it consciously. Rather to see if the code is growing or whether the word I typed actually exists (most gedit plugins aren't even tailored towards symbol parsing but are based on word databases composed of all opened documents). So what did it turn out to be? Me using a pimped text editor and a terminal for everything programming-related. Just 'cause everything else sucks.

The process of insight profits from getting down to the level of what you want insight into. In the case of programming languages, it's more instructive to see your code as a loose set of files. When using make, you should see your files as a dependency tree of timestamps and associated commands. And when using an IDE... well, I don't quite know if I actually want to say this: when using an IDE, you should see your code as a program, not just a set of code files.
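That timestamp-tree view of make can be sketched with a minimal, hypothetical Makefile for the kind of Java assignment mentioned earlier - the file names are made up, not from any real project:

```make
# Hypothetical two-file Java project: Main.java uses Renderer.java.
# Each rule reads: rebuild the target if any prerequisite has a
# newer timestamp, by running the command below it.

all: Main.class

Main.class: Main.java Renderer.class
	javac Main.java

Renderer.class: Renderer.java
	javac Renderer.java

clean:
	rm -f *.class
```

make walks this tree from `all` downwards and only recompiles targets whose prerequisites changed - exactly the "requirement tree of timestamps and associated commands" described above. (Note the command lines must be indented with a tab character.)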

I don't like this view. It conflicts with my elementary precept of reusing everything recurring and generalizable. I'm definitely a dedicated generic programmer and I tend to see redundancy everywhere. So often that I haven't yet been able to make my game real. But no, that's wrong. I knew I couldn't satisfy my personally set quality standards by continuing the usual way of just adding features. I knew there is so much in graphics and games programming that is simply redundant to no end - especially algorithms to track, set and analyse data grids or lists and tree structures. That's why I'm generalizing until it's done. The best idea of a technology doesn't simplify development if you always have to code it again. And using a precoded framework requires always having the latest version with the newest and most specialized features. A generalized structure doesn't change at all if there's new stuff to create - just design carefully and you will never have to code it again, you only specialize it. These are kinda rules and apply to most code patterns. Comparing this to my view of an IDE project - seeing all files as a single program - conflicts with my love for generalized code. I don't just see a single project full of unique files using an engine or so. I see a subset of a bigger code base tailored to meet specific needs specified by a project. I haven't yet found a way to combine the concept of IDEs and my workflow. Maybe I never will. And well - do I have to? I don't know. If I ever want to work on a small to medium-sized software project, they'll probably use a specific IDE or a system that works with every IDE understanding makefiles, or at least a command line. I doubt I'll find out before I can get my hands dirty on a real team project. I see a software project as a specialization of a generalization. And this will help me get the relation between a project and my code. But I don't like to use an IDE for that.
An IDE may be possible if there's a massive framework behind it you can use for almost everything (like in Java), but for lots of templated C++ code? I don't know. I don't like the idea.
