Programming without a computer-related degree
Published on: 2012-01-08
I used to be hesitant to admit that I didn't have a computer-related degree as a software developer. I told myself that it was because other people might think less of my ability if I admitted that I didn't have formal training, but the truth is that I felt like my technical skills were less than what they should be without the degree. After being in the business of software development for a while, I'm convinced that my lack of formal training is actually an advantage. Here's why:
As a self-taught developer, I can prove that I can learn on my own
When I first started getting paid for writing software, AJAX didn't exist, there wasn't much of a market for handheld device development, few open source tools existed in the .NET world, cross-browser compatibility was foremost in the minds of developers, and Adobe's Flash was about the only way to create truly interactive web sites. Times have changed. Certainly there are similarities between what I do now and what I did then, but there are a lot of differences, too. Software developers who are unable to keep up with the latest technologies will find themselves obsolete in a short amount of time, so the ability to pick up new technologies quickly is vital to a successful career in this field. Because I taught myself how to program, I'm used to the idea of learning a technology outside of my job. As a result, I can pick up new technologies relatively easily.
I'm not tied to any particular platform or approach
I'm rather surprised by the number of people in technology who cling to their particular tool of choice, usually for no good reason. These zealots think that everyone who uses a tool or approach other than theirs is incompetent. There seems to be a high correlation between people with these attitudes and people with computer science degrees. (I can understand that point of view – I had the same view when repairing band instruments. I was taught the "best" way to repair an instrument by arguably the best repair school in the country. Why shouldn't I expect everyone else to do it the same way?) No tool is evil, despite what the technology blogs say. Some are merely better suited to a particular purpose than others.
I find a problem and learn a technology to solve it, not the other way around
From the first day I started learning technology, I had a particular problem in mind when studying. At first, I was studying C++ and HTML in order to create a website for the music store I worked for. When I realized that C++ was rather outdated for web applications, I used the object-oriented concepts and memory management knowledge to pick up C# and ASP.NET. My SQL Server certification came after I realized that I needed a better handle on databases in order to create scalable applications. My explorations into jQuery and Java came when I started finding limitations with ASP.NET. It might not have been the most efficient way of learning things, but what I learn, I learn well. And yes, I'm constantly looking to expand my horizons by listening to other developers, even if they do have a computer science degree. :-)
I understand that there's more to software than just technology
If you read my blog regularly, you knew this was coming. Software teams will recommend tools based on what is easier for them to use, not on which results in a better user experience. As an example, I once heard a developer claim that every SharePoint project should be started and updated in Visual Studio only. His reasoning was that by allowing users to create, update, or change whatever they wanted, any particular SharePoint site would be prone to breakage and higher testing and deployment costs. While he was factually correct in his reasoning (starting and updating all SharePoint projects in Visual Studio would reduce testing and deployment costs, as well as reduce breakage), his conclusion was developer-centric. If you know anything about SharePoint, you know that its primary strength for the average user is that it can be updated extremely easily by a non-developer. He was removing one of the primary reasons for using SharePoint for the sake of his own ease of use. The attitude of putting users' needs second to developer ease-of-use is pretty common. Because reducing software teams' effort reduces costs, I'm always conscious of tools' ease of use, but I always try to keep that secondary to the end user's needs.
Computer Science degrees aren't all bad
With all that said, I don't recommend that any young person trying to get into a technology field skip a Computer Science degree. I've certainly met my share of CS graduates who were able to understand the business behind the technology, and there are still places for the tech-centric employee. I would instead recommend getting the CS degree, but adding a second major in a business-related field. Rightly or not, many hiring managers still believe that software development cannot be done well by someone without a CS degree. Plus, you do learn a lot of useful information while getting the degree. The fact that I was able to get the knowledge without the degree shouldn't minimize the importance of that knowledge. If you have the degree, be aware that there's a business world outside of the technology. If you are a hiring manager, be careful not to take the degree (or any degree) too seriously; it's the knowledge that counts.