Thinking of how to describe the tech industry to someone new is a bit daunting. There is a ton of stuff out there that I "just know", from the casual programming I did before college (10 PRINT "KIRK IS GREAT! "; 20 GOTO 10) to my formal computer science degree to what I've picked up in the 15 years I've been in the industry since. It's difficult to guess what someone who is moderately tech savvy but not really a hobbyist knows or doesn't know.
That said, I thought I'd try to lay out my understanding of the field, what's hot, what's not, what technology is used where and how, etc. This is of course a highly subjective description, but I do welcome input from other veterans. (As I review what I've written, I realize this is pretty biased towards my own experience writing web-based software, vs. stuff that runs on desktops or on mobile devices.)
SOME BACKGROUND
Sometimes it's hard for Programmers to understand why everyone seems to think programming is so difficult. In general, it ain't rocket science; if someone can cook following a recipe, or maybe make their own recipe for someone else to follow, they should have enough facility to program.
Programming is A. getting the knack of breaking a complicated procedure down into tiny steps, and B. getting better at imagining what can go wrong: the unusual cases or inputs or conditions when the program is running that might interfere with an otherwise straightforward bit of coding. And I guess C. learning what's out there: what wheels are available that you really shouldn't reinvent, and familiarity with the libraries and code bases you're working on. This blog entry is all about C, I guess.
TEMPERAMENTS
Back in college, I noticed a divide between Computer Scientists and Engineers (at my school this got so political that the subdomain for the department went from "cs.tufts.edu" to "eecs.tufts.edu" and back.) The former tend to be more interested in what can possibly be computed in a formal and abstract way, the latter more into what computers can be used for. Personally I'm definitely in the latter camp.
The divide lives on in the industry, but shifted. Here the term Engineer often refers to someone who takes a more formal approach to their code; they'll try to make sure it's reusable, well-tested, etc. from the get go. This is in opposition to a Hacker mentality (an overloaded term, to be sure: it can also refer to people who like to break into systems, and to people who just like modifying technologies to work in ways other than they were first intended.) The Hacker feels that "premature generalization is the root of all evil" (i.e. you make a system too complicated by trying to create it in a reusable fashion while only being able to guess at what the future uses of it will be) and also that spending too much time and energy on tests is not a good investment, since you can only test for things that you think will go wrong anyway, and anyway there are formal studies that indicate a really thorough test system will be as complicated as the system it was meant to test. (End self-defensive soapbox.)
TECH LANDSCAPES
In some ways, technology for web-based stuff in the industry reflects (or is reflected by) people's choice of desktop computers:
Windows - kind of old and boring but gets the job done. Microsoft made some really nice IDEs (see below) and some companies dig having such a big name behind the technology their techies are using. They tend to offer pretty powerful languages, but often don't make it easy to know what's going on "behind the scenes". (A lot of hackers don't like systems that are powerful but hard to understand, because they're tough to deal with when something breaks, or when you want to make them do something a tad beyond what they're first programmed to do.) Popular languages are ASP.Net and C# (pronounced C-sharp)... more on that stuff later.
Unix - Linux is the most popular flavor of straight "Unix". Unix is a very famous, clean, hacker- (and engineer-) friendly environment. (Tangent: Unix has a philosophy of "treat everything as a stream of text, and then do things by passing your stream of text through a series of little programs" (via "pipes").) This is the native territory for Java, Python, and Perl.
Macs - Macs have always been a favorite for artsy and designer types. Technically (well, semi-technically) these days OSX is a flavor of Unix, so there's some real overlap with Unix. Still, certain technologies are probably more popular here, like node.js and jQuery.
TERMINOLOGY
Software tends to have different layers:
Front End (UI/UX) - The frontend is the part of the computer the user interacts with directly... the stuff that lives in the browser, that they see on their mobile device screen, etc. (UI is the nitty gritty details; UX is the more generalized approach to making a system that's easy for a user to understand.) This might refer to the stuff that actually lives in the browser (javascript and jQuery are popular there) or it might be the stuff that makes the webpages on the server before pushing them down to the user's desktop (PHP or Ruby, for example, or ASP.Net on the Windows side).
Backend / Serverside - The part of the system that runs on the company's centralized computers. (There's some overlap with the idea of "The Cloud".) For a mobile app, there may or may not be a server side component. But much of the "heavy lifting" is done here. Java is really big here, and C# on the Microsoft side.
Database - when you need to store information for the long term, in general you use a database. Oracle used to be the big money thing here, but cheaper alternatives like MySQL and PostgreSQL have probably started to eclipse it. (SQL Server on the Microsoft side.) Most of this stuff is SQL based, though there's a growing interest in "noSQL" alternatives (like MongoDB) that tend to be based on documents rather than rows and columns.
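To make that rows-and-columns vs. documents distinction concrete, here's a minimal sketch in JavaScript (the data and field names are made up for illustration, not tied to any particular database):

```javascript
// Relational style: flat rows in separate tables, linked together by ids.
const customerRow = { id: 42, name: "Kirk", city: "Arlington" };
const orderRows = [
  { id: 1, customerId: 42, item: "keyboard" },
  { id: 2, customerId: 42, item: "monitor" }
];

// Document style (the "noSQL" approach): one nested document holds the
// customer and their orders together, more like a JSON blob than a spreadsheet.
const customerDoc = {
  name: "Kirk",
  city: "Arlington",
  orders: [{ item: "keyboard" }, { item: "monitor" }]
};
```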
There used to be a term "Midtier" but now I'm not sure what that means... maybe it's just another term for the serverside stuff that's not all the way back to the database.
This covers most of the style of programming I've done-- but there's also a whole world of Desktop Applications (things that run directly on your Windows PC, Mac, or Linux box) and lately Mobile Development (iOS and Android, mostly) has gotten a lot of excitement. Some of these apps will also have a server component-- for example, Instagram runs its own servers that its mobile apps talk to, and those servers also serve up webpages.
Sometimes you hear talk of The Cloud, which is just a simplified way of saying "a bunch of servers on the Internet". Some companies have started using cloud-based services, like Amazon Web Services, to run the actual physical hardware that the server side software runs on. It's a great way for companies to save the expense of knowing how to run the damn things themselves.
OTHER TERMS
A completely incomplete and idiosyncratic glossary:
IDE is an expression that comes up a lot. It stands for "Integrated Development Environment" and what that means is a text editor, geared up to know more about the language you're coding in, plus connections to compile and run your code, and probably step through it while you're debugging. "Xcode" is the IDE for all things Apple, "Visual Studio" is the thing for Microsoft, "Eclipse" is popular for people doing Java.
API - I'm not even going to Google what the initials stand for; it's just one of those terms you use. An API is a library and supporting software that will handle a certain task for your program as you write it. For example, a Graphics API will give you functions to call to draw circles and lines and squares on the screen, so you don't actually have to write the code that colors just the pixels that are inside the circle you want. (That Graphics API may in turn call another, lower-level API responsible for actually coloring each pixel, so that the Graphics API doesn't have to worry about talking with the physical screen hardware.)
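For example, here's a tiny sketch using the browser's built-in canvas drawing API (the "myCanvas" id is just a made-up name; it assumes a page with a <canvas id="myCanvas" width="200" height="200"> element on it):

```javascript
// Ask the Graphics API for a circle and a square; it worries about the pixels.
const canvas = document.getElementById("myCanvas");
const ctx = canvas.getContext("2d");

ctx.beginPath();
ctx.arc(100, 100, 50, 0, 2 * Math.PI); // center (100,100), radius 50, full circle
ctx.fillStyle = "steelblue";
ctx.fill();                            // the API figures out which pixels are "inside"

ctx.strokeRect(25, 25, 150, 150);      // a square outline: one call, no pixel math
```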
Abstraction - the API discussion is a good segue to this... at the lowest level, a program is 1s and 0s representing physical electrical states in a transistor. But it's way too hard to write complex things in 1s and 0s, so people invented computer languages that group various operations on the 1s and 0s into more human-like commands. In other words, even though you're still pushing around 1s and 0s, a computer language lets you write at a "higher level of abstraction". The code of the art program showing a rough model of an atom is at a higher level of abstraction than the code that draws the circle, which is at a higher level than the code that plots each pixel, which is higher than the code that tells the screen to be a certain color at a certain point in a hardware refresh (or whatever the hell screens are doing these days).
Design Patterns - giving certain names to high level ways of doing things, so developers can better talk to each other about how they made their program do what it does. For instance, a "Factory Pattern" describes ways of creating program objects without knowing all the details of the object you want to make, just how you want to use it.
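As a rough illustration (all the names here are invented, not from any real library), a Factory in JavaScript might look like:

```javascript
// The caller says what kind of shape it wants and gets back something it can
// draw, without knowing the details of how that thing was constructed.
function makeShape(kind) {
  if (kind === "circle") {
    return { draw: () => console.log("drawing a circle") };
  }
  if (kind === "square") {
    return { draw: () => console.log("drawing a square") };
  }
  throw new Error("unknown shape: " + kind);
}

const shape = makeShape("circle");
shape.draw(); // "drawing a circle" -- the caller never saw the construction details
```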
Asynchronous - Man, now we're getting fancy! "Asynchronous" means you've triggered something happening, but your code is not waiting around for the result... it will get "called back" when the action finishes. If a program is Synchronous, sometimes everything in its little world freezes until the action is done... this can lead to poor UX (see above).
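Here's a small sketch of the difference in browser JavaScript (the URL is just a placeholder):

```javascript
// Synchronous: the next line doesn't run until this finishes.
const shouted = "hello".toUpperCase();
console.log(shouted);

// Asynchronous: kick off a request, hand over a callback, and keep going.
fetch("/some/data.json")
  .then(response => response.json())
  .then(data => console.log("got the data later:", data));

console.log("this line runs before the data ever arrives");
```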
Scaling - writing software and setting up systems so they "scale" means making it so many users can use it at once. "Does it scale?" is an important question. Back in the day, the site "Friendster" fell by the wayside in part because it couldn't handle all the people who wanted to use it at once, and more recently Healthcare.gov also had "scaling issues".
HOW TO LEARN THINGS
One question is how to learn things and pick up new technologies.
This can be a bit of a chicken-and-egg problem: often a company will hire for knowledge of a specific technology, but often an environment like a job is the best place to learn by making something in the real world; so you need to know the tech to get the job, and you need the job to teach you the tech. In practice, you can get hired for other knowledge or general competency, and then pick up the technology there. Or, often a project will add a technology as it grows and new problems with the tech stack emerge.
There are other methods as well. A popular suggestion is to start your own open-source project and put it on GitHub. Personally I think this is kind of big and intimidating, but then again it took me a long time to realize that open source software can be as full of dodgy code as the "professional" stuff.
Even though over the past 3 or 4 years I've moved into the front-end, in 2009 or so I was behind on the developments that were letting more and more exciting stuff happen in the browser itself. Something I found (though only after I used the "get hired for basic competency, then push" trick) is the newsletters of Cooper Press, especially "Javascript Weekly". These come to your email inbox and are a great way to keep your finger on the pulse of what's going on in the tech space.
As a side note, another development of the past half-decade is the emergence of "Stack Overflow" as the repository of answers to a million little tech questions. I mean, for over a decade, "I guess I'll Google the error message before beating my head against this" has been a tremendous tool in every developer's arsenal, but more and more the most relevant results are at Stack Overflow. (But since you use Google to get there anyway, this is more an observation/prediction of what will happen to the new developer, rather than advice on where to go. Presumably said developer already knows about Google.)
There are also a lot of good tutorials out there, and courses by Stanford and other universities, and stuff like Codecademy. I don't have as much experience with these... I do know sometimes there's a large gap between that kind of simplified tutorial and "the real world", so I would say if one of these courses seems easy, it's probably not doing its job thoroughly enough.
LANGUAGES/ENVIRONMENTS
In general, there are 3 things that set languages apart from one another:
1. The syntax and basic rules of it. It's been demonstrated that any popular language is about as expressive and powerful as any other one, but there are enough small differences that lateral changes can be tough. (Also, a language switch ideally gives a new angle to your perspectives on coding in general.)
2. The libraries and frameworks that are popular for it. Now, every language ends up getting a TON of small frameworks-- often by aspiring coders trying the "start an open source project to learn and put on a resume" trick. My advice is to avoid these, unless it's clear one REALLY solves your problems... which leads me to a digression:
Another divide is between techies who like to build things from scratch vs. those who find a pre-existing library and use it. Sometimes this decision is referred to as "Make vs Buy"-- even if the "Buying" is something free, you might still be paying in terms of the time spent learning how to use the library properly.
PROS FOR BUY
- code is already written, duh!
- ideally code is well-tested by someone else, so you don't have to take on that cost either
- the cost of code is not just in the writing, but in the maintaining, and in passing it on to more people
PROS FOR MAKE
- writing code is fun (I think this one shows up too much for me)
- a prebuilt library solves your problem, sure, but probably also 9 other problems you don't have, so there's added cost in learning how to configure it.
- And when you DO configure it poorly (because let's face it, if you're not under time pressure you're probably not in business), it's often tricky to debug, because the mistake you made is HERE, in the configuration file, but the reporting of the error is WAY OVER HERE, in the stack trace, and it's even odds whether the problem will make sense to you, because the whole point of the Buy was to not have to understand the problem as deeply!
So, my biases tend towards Make, but again I know some of that is just because coding is fun. When I do use a library, I have a strong preference for things that follow the Unix-like philosophy of "do one thing, and do it well", industry-leading platforms (like jQuery in the browser space), and things that are isolated libraries my code calls while still running the main program flow, vs. advanced toolkits that have to be configured to manage the flow of execution.
BACK TO LANGUAGES
Disclaimer: these are just my rough opinions from a decade or two in the industry. I might be missing big things, either because of my own experience or just because I'm not good at sitting and thinking of comprehensive lists.
C/C++ - Popular with people doing things on Linux, and low-level (close to the hardware) type stuff. My first university classes were in these, but the languages were so primitive at the time that my opinion is probably unfairly biased against them. But they were a good start for me, because C-like syntax (with curly braces used to set aside blocks, arguments to functions in parentheses, return statements from functions, and semicolons to explicitly say where each command ends) has influenced many other languages: Java, Javascript, Objective-C, and many more.
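To show what I mean by that C-family shape, here's a tiny example (this happens to be JavaScript, but squint and the braces, parentheses, return, and semicolons could just as easily be C, Java, or C#):

```javascript
// Curly braces mark blocks, arguments go in parentheses, return hands a value
// back, and semicolons end each statement.
function add(a, b) {
  return a + b;
}

for (let i = 0; i < 3; i++) {
  console.log(add(i, 10)); // 10, 11, 12
}
```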
Java - Probably the most Java opportunities are for writing serverside code for big companies that have relatively comfortable amounts of time to "do the engineering right". This is where I was for much of the first decade of the 2000s. I also use it for toys I write, using a specific IDE/toolkit called Processing (see http://processing.org/) that provides an IDE and an API for artists and students to make cool, interactive things. Also, programming for Android devices (phones, tablets, etc.) is mostly in Java.
Javascript - Javascript is kinda like Java the way Rootbeer is kinda like Beer. This language is often thought of in conjunction with HTML5, more on that below. For years, it was kind of looked down on as a toy language for doing little tricks in the web browser, but over time that became amazing stuff like Gmail and Google Maps - the browser does more of the work, and doesn't have to wait for the server to populate every little bit of the page. These days, it often goes with jQuery, which is a powerful API for manipulating stuff in the webpage (sometimes called "The DOM", Document Object Model, meaning the objects that represent the webpage the person is looking at) and for sending messages back and forth to the server. (Sometimes that's called AJAX, an acronym involving the asynchronous (see above) nature of a browser not freezing up while waiting for the server; the messages are usually in JSON, see below.) Lately there's been a move towards also having Javascript be the language for the stuff on the server -- there the most popular toolkit seems to be node.js. (Except for jQuery, all these cool javascript libraries call themselves something.js.)
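For a taste of what that jQuery + AJAX style looks like (the URL and element ids here are made up for the example, not a real API):

```javascript
// Find the element with id="greeting" and change its text.
$("#greeting").text("Hello from jQuery");

// Ask the server for some JSON without reloading the page; the callback runs
// later, whenever the answer comes back (that's the asynchronous/AJAX part).
$.getJSON("/api/messages", function (data) {
  data.forEach(function (msg) {
    $("#messages").append("<li>" + msg.text + "</li>");
  });
});
```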
2016 Update: I've been showing this article to a few people trying to get into tech, so I guess it behooves me to keep it a little up to date. So I'll say it: I'm stunned at how popular Angular.js has become. I've rambled about it at length, and I am still surprised people are willing to give up so much transparency into what's going on, and so much decoupling of view and model, for a bit of componentization and not having to read/write form data fields. (I think Google's "endorsement" (not that they use it, but hey) helped a lot, giving it an edge even when more mature things like Ember and React came out while Angular was still struggling to reach 2.0.)
2020 Update: "React" seems to have been winning a lot here, especially when paired with Redux. I'm happy, of the Ember/React/Angular set, React was my clear favorite.
TANGENT: XML vs JSON
When a browser talks to a server, or two computers talk to each other in general, they have to figure out how they're going to wrap up the data they want to send to each other. "XML" was popular in the early 2000s... it looks kind of like HTML (see below) and was very careful about labeling its fields so it was clear just what the data it carried meant. It was also generally a big pain in the ass. "JSON" is the new hotness in a lot of ways... it's a very simple and easy to read way of including lists, and key-value-pairs, and lists of key-value-pairs, etc., and is easy to work with. It also happens to be the way you'd set up data in Javascript code.
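A tiny example of what that looks like in practice (just a made-up record):

```javascript
// JSON as text, and the live JavaScript object it turns into.
const jsonText = '{"name": "Kirk", "city": "Arlington", "pets": ["dog", "cat"]}';

const person = JSON.parse(jsonText);   // text -> object
console.log(person.pets[0]);           // "dog"

console.log(JSON.stringify(person));   // object -> text, ready to send to a server
```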
HTML5/CSS - this is a different thing than most of the "languages" in this list, because it's something you might write a webpage in, not a program. HTML has long been how you write webpages; it has all these <tags> that wrap around the text content of the page, and carry the references to other things like images that might be visible on the page. "HTML5" is a hip buzzword, and in part it means the HTML document should be JUST the content and a minimal amount of tags saying what it is, and most everything that makes it pretty and controls how it looks should be in separate CSS files... CSS is a set of commands for describing the color, border, position, etc. of things that are on the page... see CSS Zen Garden for a clear idea of how different you can make a page by keeping the HTML the same but changing the CSS (and maybe more importantly the image files the CSS refers to).
Ruby on Rails/Python/node.js/PHP - I'm lumping these together, even though that reflects my own history of "things I think are cool but don't know much about". These languages (well, two of these are specific toolkits) are very cool and powerful ways of writing server code... very popular with those little San Francisco and Cambridge and New York startups. (It used to be that COBOL was the old business language that you could still make a living on, especially around Y2K, and Java was young and hot, but now these are to Java what Java was to COBOL.) PHP is a little older and clunkier than the others, but still gets some play.
Perl - just like the Javascript section was bigger than the "hip server language" section because that's where my head's at these days, Perl gets a mention more because of what it meant to me than for where the industry is right now. (Though I've heard it's making a revival in Japan, of all places.) Perl is a weird language that borrowed a lot of syntax and style from various languages, all Unix-based. When I came to it from C, however, it was a revelation. It taught me about the following (and I'm listing these as things a new programmer should know a bit about):
- Maps - these are the key-value pairs I mentioned before, a way of linking data with what it means. You give me the key "first name", I give you the value "Kirk"; you give me the key "city", I give you the value "Arlington". This is a simple but powerful idea that shows up again and again in computer land. (There's a little sketch of maps and regular expressions right after this list.)
- Strings - the C I was learning didn't have these in as easy a way. A "String" is just any bunch of text.
- Regular Expressions - a fancy way of breaking up a string and analyzing it. For example you might have a regular expression for currency, that would say "yes" to "4.33" or "$8" or "$1,121.55" and "no" to "$SK.SA" or "121.86.32".
- Memory Management - this is less of an issue than it used to be, but in some languages if you start using a piece of memory, you also have to let go of it when you're done, or else the program holds onto it forever. (This is called a memory leak.) Many languages take care of this for you, and as you gain experience as a programmer you'll learn when you have to pay attention to it and when you can ignore it.
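Here's the promised sketch of maps and regular expressions, in JavaScript rather than Perl since that's where most readers will first meet them (the currency pattern is simplified, just enough to handle the examples above):

```javascript
// A map: keys point to values.
const person = new Map();
person.set("first name", "Kirk");
person.set("city", "Arlington");
console.log(person.get("city"));         // "Arlington"

// A regular expression for currency: optional dollar sign, digits with
// optional comma groups, optional cents.
const currency = /^\$?\d{1,3}(,\d{3})*(\.\d{2})?$/;
console.log(currency.test("$1,121.55")); // true
console.log(currency.test("$8"));        // true
console.log(currency.test("$SK.SA"));    // false
console.log(currency.test("121.86.32")); // false
```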
Perl is supergood at handling giant hunks of text files efficiently, and some use for it has come up in every job I've had.
Objective C - the kind of unusual language Apple uses. Used almost only by people who write stuff that runs on a Mac or iOS device, but that last group has EXPLODED in numbers over the last half-decade. Generally used with the Xcode IDE.
.Net/C# - This is to Microsoft what Objective C is to Apple. There are a LOT of businesses out there that code with this stuff, and Microsoft has some excellent certification classes, so if you have the time and money to go through with that, you can absolutely get a good foot up and start a career that way. (Again, I'm not writing more about it because I've had less experience here.)
SQL - a popular language for interacting with databases; each platform (Oracle / Postgres / MySQL / Microsoft SQL Server) has its own flavor. You should also be aware that there's a NoSQL movement of ways of storing data permanently that follow different patterns than SQL's Relational Database model (all these multiple, Excel-like tables connected to each other via keys).
2020 Update: I don't know where to put this in, but some other languages/systems that are popular include Hibernate, which is a way of putting data into databases from Java; Kafka, which is for passing messages around a bunch of servers so that the whole system can handle much more load; and then containerization is a big thing... when you're doing stuff on the cloud, you're making these little virtual computers on the big rented server...
SO IN CONCLUSION
This might be a work-in-progress that I will come back to, and I welcome feedback from experienced coders and aspiring newbies alike. What do you wish someone told you when you were starting out? Let me know and I'll add it here.