According to Joel Spolsky in his article on hiring programmers, the best programmers are usually "never on the job market" because they get snapped up during an internship and offered a permanent position. He also says "there are people out there who appear to be applying to every job on Monster.com", people who essentially apply for everything they see.
Which is pretty much what I did when I graduated. I started off applying only for internships or jobs at companies I would have liked to work at (and that were in Montreal, since that's where I wanted to live), such as Google or Ubisoft. They had things like Summer of Code and other internships that I unfortunately didn't know about until my third year of university, after I had taken a research position that ruled out a summer internship and didn't do anything for my resume. I never heard back from those companies, so I started applying for other jobs that might interest me. From the ones that did respond, it was the usual "we're looking for someone with experience", even though my resume stated plainly that I was in my last year of university. How am I supposed to get professional experience while still in university? Anyway, I ended up going to Monster and applying for anything remotely interesting. That actually got me interviews; however, the one job I did end up getting came through a friend of a friend, which shows you how effective Monster.com is for finding a job.
Maybe I'm just a shitty programmer, or one of those people who think they know what they're doing but actually have no idea. I'm hoping that's not the case. Basically, at this point I feel like I've missed the boat on all these great summer internships I keep finding, and am stuck working mediocre jobs that don't leave me enough time to hack around on interesting things (with the rare exception).
Feb 27, 2008
Feb 26, 2008
Mandatory vs. Optional Indentation
So here is a topic for great argument. Should indentation in a programming language be mandatory (à la Python, Haskell) or optional (à la C++, Java, etc.)?
Personally I've always been opposed to mandatory indentation, but my arguments against it are rather weak. I read an article praising mandatory indentation (the guy is a teacher so I can understand where he's coming from, having marked programming courses) and his arguments are a heck of a lot better than mine. He even hits it right on the nose as to why I don't like it: "Anytime somebody tries to impose rules that they follow 99% of the time anyway, they always focus on the 1% exceptions."
So I'm starting to think maybe mandatory indentation is not so bad. It would make other people's code easier to modify, if only by getting rid of nonsensical indentation schemes (see ClipShare) and those huge runs of closing curly brackets or end keywords, which are a pain in the ass anyway. However, I think enforcing this sort of thing on the programming community would be as difficult as enforcing := as the assignment operator to avoid the if (x = true) annoyance.
Feb 22, 2008
On Microsoft's specifications release
About friggin' time.
This is something that will definitely benefit the community. It will help sway people on the fence about GNU/GPL stuff, such as myself, to warm up a bit to the nasty software giant.
But will it benefit Microsoft? Joel Spolsky states that the Office formats are overly complicated and that anybody attempting to clone the software definitely has their work cut out for them. It's a valid point; however, the open-source community is capable of doing very difficult things, and this is an excellent opportunity to make the open-source alternatives not suck so much. That means Microsoft will have some competition, something they probably don't want.
On the other hand, it could also benefit Microsoft. Releasing the specifications means more people will develop software using Microsoft's formats, further spreading Microsoft's platforms. People will build alternatives to Microsoft's own applications, so the quality of Windows software in general will improve, which may help stem the growth of Mac and Linux.
I guess we'll just wait and see.
Feb 21, 2008
A ClipShare Review
A few months ago, I was working with a piece of software called ClipShare, a PHP video-sharing application along the lines of YouTube. Essentially, it lets you create a YouTube clone.
This software is horrible: poorly coded, insecure, unmaintainable and unscalable. It is written in PHP4 (EDIT: PHP4 is now considered legacy software and is no longer supported by Zend, the company behind PHP) and does not take advantage of any of the "object-oriented" features of the language. If you're looking for software that you can just dump in and ignore, then ClipShare MIGHT be a solution, at least until somebody injects SQL somewhere or your site actually grows. It uses $_REQUEST everywhere, including inside SQL strings, which means somebody could put SQL in a cookie and have it wreck your DB. It uses Smarty templates, which are designed to separate the view from the application code. Ta da, a bare-bones MVC architecture, minus the M part, and you have to learn a new syntax/language to program the views. Smarty also doesn't interface very well with PHP: you can only call functions, and so ClipShare gives us a gigantic 3000+ line function.php file. Ever heard of modularization?
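To see why the $_REQUEST habit is so dangerous, here is a minimal sketch (the parameter name vid and the table names are invented for illustration, not taken from ClipShare's actual code). $_REQUEST merges GET, POST and cookie data, so a "harmless" looking parameter can arrive from a cookie the visitor fully controls:

```php
<?php
// Simulate an attacker-controlled cookie value landing in $_REQUEST.
$_REQUEST['vid'] = "0 UNION SELECT user, password FROM signup";

// ClipShare-style code: interpolate the raw value straight into SQL.
$unsafe = "SELECT * FROM video WHERE id = " . $_REQUEST['vid'];

// A minimal PHP4-era fix for a numeric id: force it to an integer
// before it ever reaches the query string.
$safe = "SELECT * FROM video WHERE id = " . (int) $_REQUEST['vid'];

echo $unsafe . "\n"; // the injected UNION rides along into the DB
echo $safe . "\n";   // reduced to "SELECT * FROM video WHERE id = 0"
```

Casting only works for numeric values, of course; string parameters need proper escaping, but the point is the same: nothing from $_REQUEST should ever reach a SQL string untouched.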
ClipShare just dumps all the files of one type in one place. Suppose you have 12 images per video (4 thumbs for an animation, in 3 different sizes), and your site reaches 1000 videos. That's 12 000 thumbs in one folder, which means your filesystem is going to be running crazy slow. On top of that, those files must be synchronized between servers, which takes even longer. Triple the videos to 3000 and you've got 36 000 files in one folder. That isn't even possible with ext3 (the standard Linux filesystem) unless you recompile your kernel, since the limit is something like 32 000, so thumbs will just stop being created. Now you have to re-arrange your file structure and find every place in your code that uses thumbs (since ClipShare doesn't centralize these sorts of things, it's all over the place) and reorganize so that everything isn't in the same folder. Lots of work, and time well wasted.
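One standard way around the one-big-folder problem is to shard by a hash of the id; this is a hedged sketch of the idea (the function name and path layout here are made up, not anything ClipShare ships):

```php
<?php
// Hash the video id and use the first characters as nested bucket
// directories, spreading thumbs across 256 * 256 folders instead of
// piling them all into one.
function thumb_path($video_id, $thumb_no)
{
    $h = md5((string) $video_id);
    return sprintf('thumbs/%s/%s/%d_%d.jpg',
                   substr($h, 0, 2), substr($h, 2, 2),
                   $video_id, $thumb_no);
}

echo thumb_path(12345, 1) . "\n"; // thumbs/82/7c/12345_1.jpg
```

The other half of the fix is centralization: if every thumb path is computed by this one function, reorganizing the layout later means changing one function instead of grepping the whole codebase.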
At this point we have completely recoded the site and cleaned out everything that was there from the ClipShare code. The only things left are the names of variables and database tables, which are ridiculous (who calls a users table signup?). Unfortunately we might have to do another recode, since the server layout won't really scale very well. If you're looking to build a YouTube clone, you're better off getting a bunch of good programmers together and coding your own site from scratch.
UPDATE: Since I'm not working on it anymore, I feel it's safe to say that the site I was working on was Pornhub.com. And I can assure you (despite the fact that they still use a view_video.php for SEO reasons) that there is not a scrap of ClipShare code on that site anymore.
ANOTHER UPDATE: For those who care, I wrote a post describing to some degree how we recoded our site.
ANOTHER UPDATE: I'm not interested in coding another video sharing app, sorry.
Feb 20, 2008
The Effect of Impressions
Everybody knows that the computer world moves really fast. Technologies update quickly, prices drop quickly, and young upstarts like Google can easily take on old farts like Microsoft. When you move into open-source and Ubuntu, things move even quicker (sometimes). Ubuntu releases at most every 6 months; compare that to Microsoft, which lately has taken 5 years (2-3 years in their earlier days), or Mac OS X, which usually takes 1-2 years.
While it's nice that updates come so quickly, they don't do much to erase the impressions the older bugs leave. In my experience, when people try out a new piece of software and it doesn't work or something is wrong with it, they assume the problem will stay there for a long time. If there was a bug in Gutsy when it came out, chances are that people still using Feisty are staying on it because they think the Gutsy bug is still there. I'll note that Gutsy did seem to have a lot more bugs on release than the other versions of Ubuntu I've used (Edgy, Feisty), but most of the ones I've seen have been cleared up by now.
Feb 19, 2008
The Paradox of the Free Market: Part II
As discussed in the last article, competition must exist in a free market but it is imperfect. Therefore, it is possible for one firm to gain an advantage over another, be it through innovation, patents, or some other form of having something the competition does not.
Because of the imperfect competition, some firms will gain market share and will eventually either be able to buy their competition or get rid of them through means such as advertising, price wars, etc. So in a further developed market, it is likely that there will only be a few firms.
Since there are only a few firms, game theory comes into play. It is more profitable for all the firms in a market to work together, effectively forming a monopoly. This is called a cartel, and it is illegal in most countries (in a truly free market this would not be the case, but here again is the paradox of the free market: the monopoly would control the market, making it no longer free). The problem with this approach is that it is even more profitable for a single firm to deviate from the cooperative state, earning extra profits for a short period of time. The other firm(s) then catch on and adjust their behaviour accordingly, returning the system to the competitive state it started in. So to prevent this and still make more profits, firms can take over other firms: they acquire both the other firm's profits and more control over the market. Eventually this leads to very few firms in the marketplace.
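The deviation argument is the classic prisoner's dilemma. With two firms each choosing to collude (keep prices high) or deviate (undercut), and some invented payoff numbers written as (profit to A, profit to B), it looks like this:

```latex
% Illustrative payoffs only; the numbers are made up.
\begin{array}{r|cc}
 & \text{B colludes} & \text{B deviates} \\
\hline
\text{A colludes} & (10,\,10) & (2,\,15) \\
\text{A deviates} & (15,\,2) & (5,\,5) \\
\end{array}
```

Whichever column B picks, A earns more by deviating (15 > 10 and 5 > 2), and the game is symmetric, so both firms deviate and the market lands back at the competitive (5, 5) outcome even though (10, 10) was available to the cartel.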
Why would a firm want to be bought? One thing that is forgotten in economics (or at least the economics I saw in my undergraduate degree) is that firms are not atomic; they are composed of individuals who may have interests different from the company's. If the owners of a company see an opportunity to make a profit for themselves by selling it, they may do so. So there is incentive for both the buying company and the owners of the company being sold to complete the transaction, depending on the amount of money offered.
Feb 18, 2008
The Paradox of the Free Market: Part I
Again I will be deviating from the standard Ubuntu rants to go onto another subject of my interest: economics.
This series of articles will outline my view of the free market today and what I perceive as a gross paradox within it. These articles are not original; I'm sure other people have written about this before, but whatever. This is a blog, not a scientific journal.
I will begin by talking about what my view of a free market is and should be. A free market is one where the markets for goods, services, etc. are free from control by any party. Traditionally the free market has been defined as free from control by the government, but that is not truly free, since the market can still be controlled by a single firm or by a group of firms (a monopoly or an oligopoly, respectively).
In order for the market to be free, there must be competition. Without competition, one party will control the market, either a single firm or a group of firms (called a cartel). The purest form of competition is perfect competition, where all firms are equally small and equally incapable of affecting market prices or wages: if they raise their price, nobody will buy from them; if they lower it, they will not make enough money to finance their business. Obviously this market structure is not present in reality, as even small businesses such as dépanneurs (convenience stores) hold a positional advantage over other businesses, which leads to a market structure called monopolistic competition.
There are many factors that break down the structure of perfect competition: patents, innovations and natural monopolies, to name a few. This is especially true in the computer industry, where high fixed costs and copyrights can pose severe barriers to entry for new firms. On the other hand, it is far easier to innovate in this industry, which explains how companies like Google got to where they are.
In the next article, I will talk about the impact of imperfect competition in the long run.
Feb 15, 2008
Love: SSH
Remote login is an ancient technology dating back to at least the 60s (Telnet, anyway; SSH itself arrived in the 90s). SSH is a way to connect to a remote computer, log in and run commands as if you were sitting at that machine. With X, you can even do X11 forwarding, so that graphical applications appear on your screen even though they are executing on a different machine.
This is old news for any UNIX/Linux user, but for those of us accustomed to only using a computer while sitting at it, it's a big change. Windows has remote desktop capabilities, but they're pretty limited and slow. Fortunately, I have discovered this wonderful thing called Xming, which is essentially a port of the X server to Windows. That means that with tools like PuTTY, you can use X11 forwarding to have your X windows pop up in Windows. Awesome! I tested it out by forwarding Kate to my Windows box at work, and although it was beastly slow, it worked: I could browse and edit the files on my machine.
This opens up a lot of possibilities; maybe I can even convince my bosses to let me work from home... yeah, right.
Feb 12, 2008
Paradox of ideals
When it comes to Linux, the three things I hear the most are these:
1) Free! As in speech. Anything that is not free is bad or should be avoided if possible.
2) Spread it. We want everybody to use Linux.
3) If there's something wrong with the system, it's your fault. If you can't use it, that's your problem.
The second one is the most common, in my opinion. However, the first and third conflict with the second. First, how can Linux spread if everything is free? In a society where money is king, there must be money in it for it to gain market share, so not all software for it can be free. Sure, there are benevolent companies like Canonical or Google that will support the software, but for the mass market to move over, more companies need to adopt it. That means closed source and paying (gasp!).
The third one is rarer than the others, but it seems to be the most common attitude when dealing with open-source zealots, at least in my experience. I've brought up flaws like fragility and counter-intuitive features, but generally I get told to "rtfm" (Read The Fucking Manual), or that I broke it and it wouldn't have happened if I had known what I was doing. Unfortunately, the 99.8% of the world who aren't Linux enthusiasts (as I am) don't have the time or the desire to "rtfm" or learn how not to mess up their system. This is why the third ideal contradicts the second. While Ubuntu and Fedora and other distributions (I want to try PCLinuxOS when I have time, so sometime next summer most likely) have made excellent strides in usability, the majority of programs are still largely unusable due to too many options, a lack of non-technical documentation, or some other reason.
So Linux enthusiasts, in order to spread the word you must first accept that proprietary software does have a place in the world (I love getting paid to code) and that some people don't understand the concept of a man page or config file.
Note: I do believe there will always be a place for a pain-in-the-ass Linux distribution for power users (e.g. Gentoo); that's part of the appeal of Linux to many users. I'm not suggesting those change, only the distributions that are attempting to reach out of GNU-land.
Feb 8, 2008
Can I go back?
Another article unrelated to Ubuntu. Maybe I should start a new blog...
Recently I started coding in C++ again. Over the year since I left university, I've been working primarily with PHP and Ruby for both personal and professional projects. In my last year of university I didn't really do much programming, since my courses were either in economics (I was planning on transferring, but the money in computers told me not to) or high-level comp-sci/math courses where you're past the stage of coding everything; instead you're busy figuring out whether the program you want to write can be coded at all, and if so, whether it will finish by the time your grandkids are in the retirement home.
So, back to C++. It is wonderful and painful at the same time. After working with Ruby, the thought of not having first-class functions, conditional assignment, or if/unless modifiers after a statement is near-crippling (I realize now that in high school and first year I was a Blub Programmer; am I still one?). But working with pointers and manual memory management again makes me feel like I'm actually doing something. If only there were a language where I could have both pointers AND anonymous functions, maybe with some good type inference like in Scala. There probably is one, but I don't know about it. It is my new mission to find out.
Feb 7, 2008
A Code Igniter Review
UPDATE (Jan. 21, 2010): This review is almost two years old, which is a long time in the web world. It represents what Code Igniter was like 2 years ago, not what it might be like now. Try it out for yourself to see if you like it.
As usual, I will randomly post articles about stuff that has nothing to do with Ubuntu. This is one of those.
Recently for a work project I used a PHP framework called Code Igniter. It is designed to make your life as a coder easier while at the same time keeping a good amount of speed (compare this to other frameworks like Rails, which are slow behemoths).
Does it achieve this? Since the project isn't completely done, we haven't gotten to test the speed, but in terms of productivity I believe we actually lost time. We had previously used a small framework that one of our programmers whipped together in half a day. His framework was very fast and very simple. Although it didn't do any automagic or URL routing or any of that junk, it was enough for us to implement a rudimentary MVC structure and keep things maintainable.
What's wrong with Code Igniter, you may ask? Well, it seems fast partly because it strips so much out for you. By default there is no support for a layout template in which your views are embedded (although a quick Google brings us to this), and models must be loaded manually. Models don't connect to the DB automatically, so you must do that yourself, and consequently you must also close the DB connection yourself. Our programmer working on that part didn't realize it, since our old framework would just open the DB connection at the start of a request and close it at the end; we never had to handle it in the controllers. So we ended up running out of available DB connections during development.
We have come to the conclusion that a parallel can be drawn between PHP frameworks and programming languages. Regular PHP programming is like using assembly or FORTRAN: it is very unstructured and spaghetti-like. Code Igniter is like using C: you have some structure, but the system does absolutely nothing for you. Other frameworks like Rails are like Ruby: you just worry about making the stuff and the framework handles the nasty details for you.
All in all, I feel that Code Igniter does not give you much productivity gain that you can't get by just arranging your code in an MVC structure. If you want a framework, you're better off with Symfony or Cake, or if you want to scrap PHP altogether (which I really want to do, but work won't let me) you can use Rails or Merb.
UPDATE (05/14/2008): This post was written several months ago, which is centuries in the open-source web world. I would not be surprised in the slightest if my points above are no longer valid and Code Igniter has become a much nicer framework. It would be better if you looked into it yourself and made a judgment call.
Unfortunately I doubt I will be updating this post for the newer version of Code Igniter as I have completely dropped PHP in favour of Ruby or Python based systems and will not likely be starting any projects written in PHP in the future, nor working at a PHP job.
Feb 6, 2008
Hate: Ubuntu Kernel Upgrades
I love these when they actually fix something of mine, but that's rare. The main problems I have with them are that they mess up GRUB and they mess up my NVIDIA drivers.
GRUB messes up because Ubuntu just resets everything to the default menu, which is Ubuntu, safe mode and memtest. I never use safe mode or memtest, so I usually get rid of those. I also have an XP partition which I would like to see in the GRUB menu, but Ubuntu doesn't like to put that in either. This isn't a huge deal for me; it's just a matter of changing the configuration file. But from a less technical user's perspective, they have just lost access to their Windows drive and have no idea how to fix it. Not a good thing! I showed my girlfriend how to manually edit her grub/menu.lst file to put Windows back, but sometimes she forgets how to do it correctly and fudges the whole thing. There goes the whole computer! Ubuntu: FIX THIS. You may hate Windows, but that doesn't mean the people who use your system hate it too.
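For anyone in the same boat, the Windows entry that the kernel update drops can be re-added to menu.lst by hand. A sketch, assuming Windows lives on the first partition of the first disk; the (hd0,0) part depends on your own layout:

```
# /boot/grub/menu.lst -- add below the Ubuntu entries.
# GRUB legacy syntax; (hd0,0) = first disk, first partition.
title           Windows XP
rootnoverify    (hd0,0)
chainloader     +1
```

rootnoverify hands the partition to Windows without GRUB trying to read it, and chainloader +1 boots the Windows boot sector.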
Now for the part that pissed me off the most. I got a brand spanking new GeForce 8800 in December which runs beautifully. Unfortunately, the open-source NVIDIA drivers don't work very well with it; I can only run at 640x480. Now, I'm not really a wizard with X, so when it comes to touching the xorg.conf file I'm somewhat scared. I don't want to make things worse, which is usually what happens when I touch Linux's private parts (anything not in my home folder). So what I do is just download the wonderful little installer from NVIDIA's website and get that going, which for the most part fixes the problem. While this works, it would be nice if you could run it from X and have it complete its install via a boot-up script or something; having to Ctrl+Alt+F1 and shut down X is not something the average user would like - or know how - to do. Anyway, I just wish the kernel update would keep my NVIDIA drivers intact.
EDIT: Apparently it wasn't just the video card drivers that the update messed up, looks like my printer is dead too. Fortunately I have an XP install that WORKS.
Feb 4, 2008
Love: Quanta
Quanta is a web development IDE for KDE. I use it for anything web development related, at least when I'm running under Ubuntu since there is no Windows version.
Here are some things that I look for in an IDE:
- Basics: Syntax highlighting, code folding, block indentation, bracket matching, etc. These are little things that help you out when you're programming. Oh, and make sure it intelligently handles tabs. If I press backspace, it deletes one character. Not 20. See PhpEd for an example of retarded tab management.
- Minimalist: Don't get in my way. I want to code without stupid paper clips or something popping up like, "It looks like you're writing an HTML file, let me help!" While code-completion is nice, don't be excessive about it. In PHP where you tend to have hundreds of variables everywhere, I don't want the code-completion thing giving me a list of twenty different things when I type "$num" or automatically finishing a word when I type something like "ret" (see Netbeans).
- Directory Tree: On the left side, give me a directory tree. I've spent enough time developing with Kate or KDevelop to know that having to double click ".." several times and then find the folder I want is very annoying.
- Built-in FTP: Mainly for web development IDEs. Many of the Linux FTP programs are fairly slow to use. gFTP doesn't use a tree structure, so every time you want to move to a different folder it's a bit of work. Even worse is the command-line sftp, with all the cd this and lls that - way too much typing to make me productive. I just want to hit Ctrl+Shift+U (or whatever your upload shortcut is) and be done with it.
Quanta fits all of these. There are a few things I would change - importing files and folders into a project is a little clunky, and the FTP can be awkward when uploading groups of files - but other than that it pretty much hits everything on the nose.
The main problem that I had with Quanta is that it is for KDE only, so if you want to install it on Ubuntu you have to install all the kde-base libs. Fortunately I use a lot of other KDE programs like Kate and KolourPaint, so it's all good. Second main problem is that by default (unless you're under KDE) you don't have the SFTP protocol installed, so you're limited to basic protocols like file:// or ftp://. With a little help from our friend Google I figured out you need to install the kio-base libs. It wasn't a huge problem, but anytime you must consult Google is considered wasted time in my opinion.
A note to emacs/vi users: Screw off. I want to spend my time coding instead of memorizing thousands of commands.