How to Pick a Programming Language to Learn Today

CommanderFrank

Looking to fast-track your lot in the IT world by learning to code? The major hurdle is picking a language that fits you and your job requirements. To help with this problem, MakeUseOf consulted several resident coders for practical career advice.

If someone were looking to learn programming purely for the sake of a job, they would fail. If they were looking simply because they liked the sound of learning how to program something, it wouldn't matter which language they chose, as long as it sparked their passion.
 
When I was in school doing physics/astrophysics, one teacher had us write a simulation of a collapsing star, and he taught us using IDL. At the time I thought it was so cool and neat, and before I knew it I was doing everything at school in IDL. Later on, in grad school, I needed to write some simulations and figured I'd do a lot of the work at home, only to find out that in order to compile the program I'd have to either 1) buy a license, or 2) log into the school servers, upload the program, and compile/run it there, hoping there wasn't too much graphical output popping up. I then wished I had learned C++ or some other similar language that didn't require me to pay to compile it.
 
Once you have a firm grasp of the concepts and moderate knowledge of one language, you can fairly easily move into any other language, as long as you're up for the learning curve.
 
C++ is a good start. You'd be surprised how many languages seem similar to C++. Once you have a basic understanding of one language, the others are easy to pick up. You're just looking for how things are done differently in the other language. They're all essentially the same thing with different syntax.
 
Heh, I am starting Python 3 myself and then moving to C++. Actually, I am starting today once I get downstate. :D
 
Doesn't the same rule apply: know one universal high-level language (C), one low-level one (assembly), Java for universal mobile, and SQL?
 
Once you have a firm grasp of the concepts and moderate knowledge of one language, you can fairly easily move into any other language, as long as you're up for the learning curve.

In theory.

But it is a lot easier to stay in a language family, where the syntax is similar.

I like the C-syntax languages (C/C++/C#/Java/JavaScript).

The similarity of syntax makes it so much easier to recognize coding structures.

Conversely, I despise languages/scripting that don't use braces; the code is just so much less readable (Visual Basic, VBS, Tcl, etc...).

Luckily, C-syntax languages are fairly dominant everywhere. JavaScript is the scripting language of the internet. C/C++ is the language of to-the-metal coding. C# and Java are great general-purpose languages with GC.
 
I decided to give Python a shot. I'm in microbiology, and my father programs in C++ for a company. I wasn't aware there were multiple versions of it; so far it seems straightforward from what I've seen and read. Not sure what I'll do with it... but something, eventually.
 
In my experience, the languages you MUST know are C and its derivatives, SQL, and maybe Python for simpler things.

And while you're at it, you also want to be fluent in PowerShell.
 
You should probably concentrate on learning paradigms and design patterns rather than languages. Pick a paradigm first, and then pick a language that does it well. The normal path to learning paradigms is probably imperative->functional->object-oriented for full blown computer-scientist types, but if you're picking up programming as a secondary skill you can probably start with what you think fits your needs best. Imperative tends to be about writing algorithms. Object oriented is more about modeling relationships.
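To make that contrast concrete, here's a small sketch in Python (all names are invented for illustration): the imperative version is a sequence of steps that computes a result, while the OO version models the things involved and asks them about themselves.

```python
# Imperative: a sequence of steps that computes a result.
def total_area_imperative(rects):
    total = 0
    for w, h in rects:          # each rect is a plain (width, height) tuple
        total += w * h
    return total

# Object-oriented: model the "things" and their relationships.
class Rectangle:
    def __init__(self, width, height):
        self.width = width
        self.height = height

    def area(self):
        return self.width * self.height

def total_area_oo(shapes):
    # The caller doesn't care how each shape computes its area.
    return sum(shape.area() for shape in shapes)

assert total_area_imperative([(2, 3), (4, 5)]) == 26
assert total_area_oo([Rectangle(2, 3), Rectangle(4, 5)]) == 26
```

Same answer either way; the paradigms differ in where the knowledge lives, not in what they can compute.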
 
When I was in school doing physics/astrophysics, one teacher had us write a simulation of a collapsing star, and he taught us using IDL. At the time I thought it was so cool and neat, and before I knew it I was doing everything at school in IDL. Later on, in grad school, I needed to write some simulations and figured I'd do a lot of the work at home, only to find out that in order to compile the program I'd have to either 1) buy a license, or 2) log into the school servers, upload the program, and compile/run it there, hoping there wasn't too much graphical output popping up. I then wished I had learned C++ or some other similar language that didn't require me to pay to compile it.
IDL is very commonly used in my field as well (atmospheric science), particularly among those whose research is related to remote sensing.

My first programming language was Python, which I learned about six years ago; through formal CS education I later learned the big systems languages: Java, C++, and C.

I think that in the software-development world, scripting languages like Python and some newer things like Clojure are gaining a lot more popularity due to their simplicity. With the power of today's hardware, the performance advantage of compiled languages like C or even C++ is becoming less necessary, with the exception of game development, which I believe is still largely C++.

That said, in the scientific world, particularly in the realm of numerical modeling, compiled languages are still the norm. The language of choice, though, is not C or C++ but Fortran, which seems obsolete from a software-engineering perspective but does have some advantages over C in this type of application. I still use Python for data analysis, but for my high-performance computing needs, Fortran (and occasionally C) is the most useful language.
 
I actually started with a non-traditional approach to programming, beginning with MATLAB, since my undergraduate education was in applied mathematics.

Trying to simulate stochastic processes and evaluate mathematical expressions in lower-level languages like C, C++, or C# would have been almost impossible for me.

C++, C#, Java, and SQL were learned later on (I never learned C, Python, or Fortran). But I have no interest in low-level programming anyway, unless I plan to work on creating the very fundamentals of something, which is more along the lines of a CS major.

My work is mainly based on MATLAB, R, SAS, and Maplesoft.
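For what it's worth, that kind of stochastic simulation doesn't have to live in MATLAB or R either. A minimal sketch in Python of a 1-D random walk (names purely illustrative), using only the standard library:

```python
import random

def random_walk(steps, seed=None):
    """Simulate a simple 1-D random walk: move +1 or -1 at each step."""
    rng = random.Random(seed)   # seeded generator, so runs are reproducible
    position = 0
    path = []
    for _ in range(steps):
        position += rng.choice((-1, 1))
        path.append(position)
    return path

path = random_walk(1000, seed=42)
# Sanity check: every step changes the position by exactly 1.
assert all(abs(b - a) == 1 for a, b in zip([0] + path, path))
```

For serious numerical work you'd reach for NumPy to vectorize this, but the plain-Python version shows how little ceremony is involved.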
 
My first exposure to programming was MATLAB. It was a required course for all engineers at my school. Sure, my high school had Java and C++ courses, but I wasn't interested then.

Well, into my 2nd year of college, I realized I could do my lab assignments faster if I wrote some code to do all my computations. By avoiding Excel for processing large data sets, I was able to do things faster and better. Come my involvement in some undergrad research, I had to learn Fortran to run some CFD code. That became the stepping stone to an all-expenses-paid grad school, and I soon had to learn Python, Bash, C++, and XML, all for in-house CFD code development.

Fast-forward to the present: at my 2nd job (and in the field I studied) I learned Perl and relearned VBA. Now everyone at work thinks I like to code. Yes, I do, but only because it lets me get my job done faster and better than if I didn't code things. The best example was when my bosses went to a customer meeting and ended up emailing me to say the deadline had been moved up two months. All the work that had to be done in four months now had to be done in two. And since these were CFD cases that each took several days to run, it cut into my post-processing and analysis time. Imagine processing, analyzing, and creating presentations for 30+ cases, all within two weeks. Yup! I got it done. And I think I even got an untold bonus! Now everyone looks to me for code help. All the new projects that require extensive work come to me, because I usually find a way to script them up or expand our team's capabilities.

This is why I code. This is why I continue to learn to code. This is the power of code.
 
I write everything in Malbolge. My colleagues complain about bad documentation, but hey, they're just lazy.
 
For people teaching themselves, I recommend Ruby or Python. Syntactically, both are easy languages to understand. As for choosing a language, it all depends on what you want to do. If you are going to write web services, Ruby is pretty heavily used. If you are going into application programming, Python is a great starting place, and then you can move into C++ or C#. It also comes down to tutorials: I haven't come across many good tutorials for C++, but there are tonnes for Ruby and Python. As already noted, once you learn one language it is pretty easy to learn another. The hard part of programming isn't the language; it's algorithms and logic.
 
Started learning C++ recently. I've always wanted to learn computer programming but kept putting it off. Wish I hadn't; it would make finding a new job way easier if I already knew this stuff...
 
For people teaching themselves, I recommend Ruby or Python. Syntactically, both are easy languages to understand.

For syntax, I find {}-delimited languages easier than whitespace-delimited languages.

Braces make it dead obvious where control structures and functions start and end. They end ambiguity and make it easier to cut and paste code (zero worries about matching the whitespace).

Significant whitespace/indentation can be quite a bit less obvious and can lead to more obscure bugs: mixed tabs and spaces, copy-pasted code...

This is one of those holy wars of coding, and I come down firmly on the side that significant whitespace in code is an invention of the Robot Devil.

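For what it's worth, here is the kind of silent bug the braces crowd worries about, sketched in Python (a contrived, illustrative example): dedenting a single line moves it out of the `if` block, the program still runs, and nothing warns you.

```python
# Intended behavior: count only the positive values.
def count_positives(values):
    count = 0
    for v in values:
        if v > 0:
            count += 1          # inside the if: counts positives only
    return count

# The same code after a careless dedent (e.g. a bad copy-paste):
def count_all_buggy(values):
    count = 0
    for v in values:
        if v > 0:
            pass
        count += 1              # one level out: now counts EVERY element
    return count

assert count_positives([5, -1, 2]) == 2
assert count_all_buggy([5, -1, 2]) == 3   # runs fine -- the bug is silent
```

In a brace language the equivalent mistake usually produces a visible mismatch; with significant whitespace, both versions are valid programs.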
 
I read that article twice and it didn't help one bit.

The one sentence that made the most sense was:

"Don’t Learn a Language: Learn Software Design"
 
I still miss Turbo Pascal.

Back in the '80s, having that and Bob Ainsbury's "Technojocks Turbo Toolkit" could make you look like Superman :)
 
Had a class on Fortran & C++ at school. I use Python now for instrument & database stuff. I like the whitespace and not having to compile the code to run it.
 
C is the best language to start with. C++ is a mess and overly complicated (NIH syndrome), and higher-level languages hide too many things.

You can learn to program in something like Python, but you can't learn to *program*, because you have no idea what is going on under the hood. C exposes the programmer to many lower-level idioms (especially when you use inline asm) and to computer hardware in general, so people learn things like the difference between the stack and the heap, or how to optimize programs for a given hardware architecture.

Unfortunately, the vast majority of new programmers today are completely inept because they never learn anything lower level than Java or Python or C# (it is a sad state when companies complain that, say, Scala is too hard a language for their "programmers"), and so they really have no idea what is going on.
 
You should probably concentrate on learning paradigms and design patterns rather than languages. Pick a paradigm first, and then pick a language that does it well. The normal path to learning paradigms is probably imperative->functional->object-oriented for full blown computer-scientist types, but if you're picking up programming as a secondary skill you can probably start with what you think fits your needs best. Imperative tends to be about writing algorithms. Object oriented is more about modeling relationships.
I agree with this approach. When I went to college we jumped right into C++ and recognizing structure and keywords. Unfortunately I think that led to a lot of students writing garbage code because they were writing in a brute-force manner to achieve the desired result. Object-oriented and data structure paradigms were not taught until we had a more advanced understanding of the language itself, at which point ASM was introduced. I took the time to study the paradigms more in-depth myself, and I'm glad I did. I was able to understand and appreciate the structure and process a lot better and was able to develop my own best practices.
 
You should probably concentrate on learning paradigms and design patterns rather than languages. Pick a paradigm first, and then pick a language that does it well. The normal path to learning paradigms is probably imperative->functional->object-oriented for full blown computer-scientist types, but if you're picking up programming as a secondary skill you can probably start with what you think fits your needs best. Imperative tends to be about writing algorithms. Object oriented is more about modeling relationships.

That's absolutely correct!

So many devs think they're good at xyz because they know the language.
For example, knowing JavaScript only gets you to the starting line for event-driven browser apps. And understanding Java isn't qualification for distributed systems... (etc.)
 
You can learn to program in something like Python, but you can't learn to *program*, because you have no idea what is going on under the hood. C exposes the programmer to many lower-level idioms (especially when you use inline asm) and to computer hardware in general, so people learn things like the difference between the stack and the heap, or how to optimize programs for a given hardware architecture.

Unfortunately, the vast majority of new programmers today are completely inept because they never learn anything lower level than Java or Python or C# (it is a sad state when companies complain that, say, Scala is too hard a language for their "programmers"), and so they really have no idea what is going on.

I'm going to argue that OO and imperative are almost completely unrelated skills. You can have the opposite of the problem you are describing: there are older programmers who are great at imperative/low-level work, but getting them to do OO is like pulling teeth, and you end up with garbage objects and classes. Trust me when I say you do not want someone trying to think about heaps and stacks while they're doing Java. Some of the things I have seen I still have nightmares about.

A current CS student or new software engineer should be good enough at both paradigms and should be able to select the right paradigm(s) and patterns for the job at hand. But there are people who are just excellent programmers, specialized in one or the other, who can still create valuable solutions.
 
I'm going to argue that OO and imperative are almost completely unrelated skills. You can have the opposite of the problem you are describing: there are older programmers who are great at imperative/low-level work, but getting them to do OO is like pulling teeth, and you end up with garbage objects and classes. Trust me when I say you do not want someone trying to think about heaps and stacks while they're doing Java. Some of the things I have seen I still have nightmares about.

A current CS student or new software engineer should be good enough at both paradigms and should be able to select the right paradigm(s) and patterns for the job at hand. But there are people who are just excellent programmers, specialized in one or the other, who can still create valuable solutions.

One of the biggest problems in computer programming is the fact that generally management is clueless (which is why they are management; anyone who is actually competent at programming is too valuable to take away from programming) and they run everything off of buzzwords and hype.

Object-oriented programming has its uses but it is no panacea and there is an alarming tendency to use OO for things where you are better off doing imperative or functional programming simply because some idiot manager read some article from some idiot technology writer that said OOP is the best thing since sliced bread. And you still need to know how the computer works in order to program effectively, even with OOP.
 
I started with BASIC on an Atari 64K machine. My first program grew into a fully capable tool for estimating costs, materials, and labor for bidding sheet-metal work, like air-conditioning a hospital or school building. I did it for my Dad and his business, and he used it for more than 10 years after. That was in 1985.

I loved it, but I realized a career probably wouldn't be like that job was. A career was probably going to be sitting in a cubicle writing a segment of code for a project where your piece has to fit with 8 other guys' pieces, or writing new cash-register code, or some other boring shit. I dropped that idea in its infancy.
 
One of the biggest problems in computer programming is the fact that generally management is clueless (which is why they are management; anyone who is actually competent at programming is too valuable to take away from programming) and they run everything off of buzzwords and hype.

Object-oriented programming has its uses but it is no panacea and there is an alarming tendency to use OO for things where you are better off doing imperative or functional programming simply because some idiot manager read some article from some idiot technology writer that said OOP is the best thing since sliced bread. And you still need to know how the computer works in order to program effectively, even with OOP.

From my viewpoint there's no overlap between the problems that can be solved with OOP and with imperative code. OOP is, in my mind, a management tool for creating and maintaining models of things. You shouldn't (or can't) write algorithms and functions with it.

On the other hand, if you try to create a manageable model of some system of things with imperative or functional code, I believe you'll naturally progress, perhaps without realizing you've done it, to some implementation of OOP's core concepts.
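A toy Python sketch of that progression (every name here is invented for illustration): state in a dict plus free functions that operate on it is already most of the way to a class, which just bundles the two together.

```python
# Imperative style: state in a plain dict, behavior in free functions.
def make_account(owner):
    return {"owner": owner, "balance": 0}

def deposit(account, amount):
    account["balance"] += amount

# Bundle the state and the functions together and you have, in effect,
# reinvented the core of OOP: data plus the operations that belong to it.
class Account:
    def __init__(self, owner):
        self.owner = owner
        self.balance = 0

    def deposit(self, amount):
        self.balance += amount

acct = make_account("ada")
deposit(acct, 50)

obj = Account("ada")
obj.deposit(50)

assert acct["balance"] == obj.balance == 50
```

The dict-plus-functions version and the class do exactly the same thing; the class just makes the grouping explicit and enforceable.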
 
From my viewpoint there's no overlap between the problems that can be solved with OOP and with imperative code. OOP is, in my mind, a management tool for creating and maintaining models of things. You shouldn't (or can't) write algorithms and functions with it.

On the other hand, if you try to create a manageable model of some system of things with imperative or functional code, I believe you'll naturally progress, perhaps without realizing you've done it, to some implementation of OOP's core concepts.

OOP is absolute overkill for simple programs. If I am writing a simple command-line utility to, say, count the number of words in a file, OOP would do nothing but increase development time with no real benefit. One of the (many) things I hate about Java is the sheer amount of boilerplate code you have to write to get even the most basic of programs. (Compare Java to Scala for just how ridiculous Java's verbosity is: http://stackoverflow.com/questions/...here-scala-code-looks-simpler-has-fewer-lines)

Personally, I'm far more interested in functional programming than imperative. Functional programming's focus on avoiding side effects makes it ideal for multi-threaded workloads, and fewer side effects mean fewer possibilities for bugs.
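A minimal Python illustration of that point (function names are hypothetical): the pure version leaves its input untouched and always returns the same output for the same input, while the in-place version mutates shared state, which is exactly what makes concurrent code hard to reason about.

```python
# Side-effecting version: mutates the caller's list in place.
def add_tax_inplace(prices, rate):
    for i, p in enumerate(prices):
        prices[i] = p * (1 + rate)   # shared state changes under everyone's feet

# Functional version: no mutation; same inputs always yield the same output.
def add_tax_pure(prices, rate):
    return [p * (1 + rate) for p in prices]

prices = [100.0, 200.0]
taxed = add_tax_pure(prices, 0.5)
assert prices == [100.0, 200.0]   # original list untouched
assert taxed == [150.0, 300.0]

add_tax_inplace(prices, 0.5)
assert prices == [150.0, 300.0]   # original list destroyed
```

Two threads can call the pure version on the same list without any locking; the in-place version would need synchronization (and the original data is gone afterwards either way).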
 