7.09.2011

strings and chars

The more I think about string formatting, the more I wonder what would happen if I created a game engine that uses special characters in strings to print more than plain text: different symbols, maybe even icons. In fact, the flexible size of char doesn't make it particularly suitable for that. But the real question is whether one should do something like that at all! Some of my earlier programming experiments included such a system, where I reserved certain characters to print smilies and other sorts of logos or icons inside the text. That was no problem for me, because I never used those characters. But what about people who use them all day? What about French users typing '´' and '`'? What would they run into? Well, they'd probably just not use my engine! And that's why I'm thinking about this more than about anything else right now.

For now, I've decided to leave all the common ASCII rules aside and let the user define a custom table for whatever he wants to print (there's a small sketch of the idea at the end of this post). This is slower than relying on standard ASCII values, but it ensures that everyone can choose whatever symbols he wants or needs - automatic mapping. I got that idea while designing integer conversion routines: write one function for converting to hex and another for decimal, even though the algorithms behind them are EXACTLY the same? No, thanks. When writing code in general, I often run into situations where I'm forced to either copy a single function a dozen times and alter only one line, or invent a rather generic solution (see the second sketch below). Currently, I prefer the generic solution, though for performance reasons I still prefer copy-pasting memory-related code. One could argue that converting numbers to strings is as important as copying memory, but I disagree here: from what I'm used to, if dumping out numbers ever becomes performance-critical, it makes more sense to keep the value in its raw binary form and only turn it into text on demand.

Going back to the topic with this in mind, it makes sense not to rely on special characters in strings, but to add custom XML formatting for more complex texts and layouts later (the last sketch below shows the kind of markup I have in mind). I really like XML; I fell in love with it while learning HTML back in school. Looking at its design, it's really a totally primitive but effective solution to the typical problem of encoding human-readable data. I've never found anything nicer for simple data, mostly because everything else either tries to copy programming languages or to integrate into them (JSON, for example). However, covering the whole XML standard is as annoying as anything else in the world of web formats (gladly, I won't have anything to do with that). There's even a notation for validating XML documents - inside those documents (DTDs, if I remember correctly)... Quite useless for simple applications, if you ask me. Almost no non-web program validates documents this way, as far as I know. And why? Probably for a number of reasons, including that it's easier to check the structure directly in the program with a simple XML tree scanner than with a complicated notation that's nowhere near as flexible as normal code.

That said, there are other web standards that can be useful for normal programs. URL notation, for example (or URI, I never remember which of the two is the general term). I'm always annoyed by how inconsistently most systems handle paths and the like. On a typical desktop system, we should make more use of those standardized, system-independent notations. Knowing your enemies gives you a great portability advantage.
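Here's a minimal sketch of the custom-table idea, in plain C. All names are made up for illustration; the point is just that the engine never assumes ASCII and instead lets the user register which byte value maps to which glyph:

    #define TABLE_SIZE 256

    /* byte value -> glyph index in the font atlas; -1 means unmapped */
    static int glyph_table[TABLE_SIZE];

    void table_init(void)
    {
        for (int i = 0; i < TABLE_SIZE; i++)
            glyph_table[i] = -1;
    }

    /* the user decides what each character means, not the engine */
    void table_map(unsigned char c, int glyph)
    {
        glyph_table[c] = glyph;
    }

    int table_lookup(unsigned char c)
    {
        return glyph_table[c]; /* caller treats -1 as "not printable" */
    }

A user who never types '´' can map it to a smiley; a French user maps it to the actual accent glyph and picks some other byte for his icons.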
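And this is roughly what I mean by the generic integer conversion - one routine parameterized by the base instead of one copy for hex and one for decimal (again, names and details are just a sketch, not final engine code):

    #include <stddef.h>

    static const char DIGITS[] = "0123456789abcdef";

    /* writes 'value' in 'base' (2..16) into 'buf' (capacity 'cap'),
       returns the string length, or 0 on bad base / too small buffer */
    size_t u32_to_str(unsigned int value, unsigned int base,
                      char *buf, size_t cap)
    {
        char tmp[33]; /* worst case: 32-bit value in base 2 */
        size_t n = 0;

        if (base < 2 || base > 16)
            return 0;

        do {
            tmp[n++] = DIGITS[value % base];
            value /= base;
        } while (value != 0);

        if (n + 1 > cap) /* +1 for the terminating NUL */
            return 0;

        for (size_t i = 0; i < n; i++) /* digits came out reversed */
            buf[i] = tmp[n - 1 - i];
        buf[n] = '\0';
        return n;
    }

u32_to_str(255, 16, buf, sizeof buf) gives "ff", u32_to_str(255, 10, buf, sizeof buf) gives "255" - the same loop, with only a parameter changed.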
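Finally, the kind of markup I'd rather use for icons and layout instead of hijacked characters - a purely hypothetical format, nothing final:

    <text>
        Well done! <icon name="smiley_grin"/>
        Your score: <b>9000</b>
    </text>

A dumb renderer can simply strip the tags and print the plain text; a fancy one turns <icon> into a glyph from the user's table. Either way, everyone gets to keep his accents.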
