Want to relearn C, what book?

bob4432

Title says it all. I learned C probably 15 years ago and never got a chance to really use it, so, well, I cannot program in C. I had an old book, but figured there are so many technology differences between then and now, with all the different platforms, etc., that I may as well relearn with a strong core from this century.

My goals for your suggestions:
- Relearn good core C programming principles. I have a feeling a lot of this will come back to me, but I would like to make sure I will be guided correctly.

- I use multiple platforms, so I will be programming for PC and also for iOS, Android, and other embedded systems where I have designed the hardware and would like to write my own code for it.

Please advise,
Bob
 
Hello Bob, I'm starting to relearn C now too, after about 20 years away. I've just enrolled on a software development course, although I haven't actually had a lesson yet.

The books they have recommended on the reading list include C Programming in Easy Steps by Mike McGrath, which I have started working through, and C All-In-One Desk Reference For Dummies, which I've not really looked at yet.

I will work through C Programming in Easy Steps first; as it's easy to read and under 200 pages, I think it will be a good introduction to C.
 
K&R?
That's the one I used in the last century to learn C.

But you will struggle more with the different libraries for each target platform ;-)

E.g. iOS with only C? Not sure that would be enough. Yes for libraries, but not so much for UI (at least I never tried it; I've only used Objective-C or, recently, Swift).
 
ABC (A Book on C) was good back in the day. Not sure if it's still available.
 
Thanks for the recommendation. I had a look on Amazon UK and someone was selling a second-hand copy for 2 pence (it is £40 new), so I've ordered it!
 
iOS's "Objective-C" is very different from classic C, and Android is mostly Java, so a book on C might just be the first of many. Maybe one on an object-oriented language would be better, because C's procedural concepts are pretty unpopular these days (Objective-C, C++, Java, and C# are used more frequently than C now). C is mostly relegated to embedded applications now.

Not that I want to discourage you.
 
Bob, I am going through a similar process myself. I was a VB 4, 5, and 6 programmer back in the day. I did some PHP when I was a webmaster and a little Java in college. I finally decided to give C a fifth or so shot in my free time. I think I have forgotten enough of the other languages that I can get used to C's quirks now. I actually like that it doesn't ram OOP down your throat like Java does; that probably comes from my VB roots. This isn't what you asked for, but this website
http://www.cprogramming.com/tutorial/c-tutorial.html
does go a little more in depth than, "copy and paste this, now compile it, if you don't understand what that means don't worry about it you can learn that later."
 
atom,
Thanks, definitely not looking for a copy/paste thing; I like to know the "why" of what I am doing.

 
Kernighan and Ritchie gets my vote, as they should know the inner guts of the language. :) That, and just going for it.

I taught an intro to C programming class for several years. It's a good book that is too advanced (or, rather, too dense) for beginners, but probably good for someone who's already versed in programming in other languages.
 
Learning programming is very much like learning math: you have to find texts that work for you and enable you to learn the way you learn. Unfortunately, like mathematicians, programmers tend to suffer from an inability to communicate clearly in English, and university texts are rarely the best, since they are used as money makers for professors and universities. I haven't found a better method than hitting the libraries and bookstores to determine for myself whether the prospective texts are worthwhile.
 
How come C and not C++? Do you have a target application you want to program for?

C is cool and all that; it's just not object oriented. If you are going to go through all that trouble, it should be for something relevant to your needs.
 
For OOP, I'd start with C# at a minimum and work up to Java (the two are similar). C's procedural style of programming is kind of 'outdated' now compared to the languages commonly used to write programs.
 
@ jiminator - embedded applications.

I just need to relearn the functions and want to make sure I code in the most efficient manner, as I am limited in speed and memory by my lowest common denominator. I figure that after relearning C, C++ shouldn't be too hard, but first I need to make sure I am good with C in a very efficient manner and that I am using the correct functions for the job at hand. I will be reading/writing to/from memory addresses of the MCU and also various sensors, so I will have direct interaction with the hardware.
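By direct interaction I mean the usual memory-mapped register access, something along these lines (the addresses and register names below are made up for illustration; the real ones come from the MCU's datasheet or vendor header):

Code:
#include <stdint.h>

/* Illustration only: hypothetical addresses, not from any real part. */
#define GPIO_OUT_REG     (*(volatile uint32_t *)0x40020014u)
#define SENSOR_DATA_REG  (*(volatile uint16_t *)0x40031008u)

/* Turn on one output pin without disturbing the others (read-modify-write). */
static void led_on(unsigned pin)
{
    GPIO_OUT_REG |= (uint32_t)1u << pin;
}

/* Read a raw sample straight from a memory-mapped sensor data register.
 * 'volatile' forces a real bus access on every call. */
static uint16_t read_sensor(void)
{
    return SENSOR_DATA_REG;
}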

I should have addressed this more in the original post and I apologize for that.
Bob
 
K&R C
Expert C Programming: Deep C Secrets, Van Der Linden

The latter goes into in-depth topics, examples, and exercises; it's a good supplement if you know the syntax and concepts pretty well but want to nail them down in C. It covers the nitty-gritty, like the differences between arrays and pointers, and chapter 8 is about linking. The late chapters give C++ examples, I believe, and offer C alternatives / how-you'd-do-it-in-C (a C "object" being a typedef struct with function pointers and all that good stuff).
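If anyone wonders what the "typedef struct with function pointers" trick looks like, here's a minimal sketch (my own toy example, not one from the book):

Code:
#include <stdio.h>

/* A tiny "object": data plus a function pointer bundled in one struct. */
typedef struct Shape Shape;
struct Shape {
    double width, height;
    double (*area)(const Shape *self);   /* the "method" */
};

static double rect_area(const Shape *self) { return self->width * self->height; }
static double tri_area(const Shape *self)  { return 0.5 * self->width * self->height; }

int main(void)
{
    Shape shapes[2] = {
        { 3.0, 4.0, rect_area },
        { 3.0, 4.0, tri_area  },
    };

    /* "Polymorphic" call: each struct decides which function runs. */
    for (int i = 0; i < 2; i++)
        printf("area = %.1f\n", shapes[i].area(&shapes[i]));
    return 0;
}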

I don't mean to hijack, but are there any good books covering the nitty-gritty of makefiles? I learned in a Windows/Visual Studio environment at home and in junior college.


Sorry for any typos; I damaged my radial nerve, so I'm not typing so well. :(
 
@ jiminator - embedded applications.

I just need to relearn the functions and want to make sure I code in the most efficient manner, as I am limited in speed and memory by my lowest common denominator.
I don't understand what you mean by "lowest common denominator".

If I were to re-learn a language, I'd probably just use the same materials that I used when I learned it the first time around. Why isn't that an option for you?

If you want to learn C and have some programming experience, then The C Programming Language by Kernighan and Ritchie is a good bet. You can get a copy of The C Answer Book if you want help going through the exercises in the K+R book.

If you'd like more of a tutorial, you might try C Primer Plus by Stephen Prata. It's more of an introduction, with a more stepwise tutorial approach than the K+R book, and it's appropriate for people with little or no programming experience.

You repeatedly mention "efficiency". It's not clear whether you mean that you want to write code efficiently (that is, code as quickly as possible), or that you want to write code that's efficient (that is, code that executes as quickly as possible or in as little memory as possible). The former, the art of programming, isn't something readily learned from a book. There are a few titles on the craft of programming, but you're probably better off focusing on your basic skills first.

If you're concerned with runtime efficiency, then you'll want to study algorithms. You might start with Introduction to Algorithms, as it's the canonical school textbook. When you start needing your own algorithms and solutions, a book like Skiena's Algorithm Design Manual (http://www.amazon.com/dp/1849967202/?tag=mooseboycom-20) is helpful.

An appropriate choice of algorithm will give you the best performance control: a slow algorithm is slow and can't be made fast with shortcuts or tricks; a better algorithm is the smarter choice. If you really need to write code that's as efficient as possible, you'll attack the algorithm first. After that, you'll need to focus on the specific architecture you're targeting, to learn what performance characteristics it has and how to make them work to your advantage. That will come from the programming manual for your processor and whatever other documentation the vendor supplies. You might find books for the more popular platforms (like Intel). An interesting, (mostly) platform-agnostic book of programming tricks is Hacker's Delight.
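To make the "attack the algorithm first" point concrete, here's a toy comparison (my own example, not from any of those books). Both functions answer the same question, but the second does logarithmically less work on sorted data, and no amount of micro-tuning will let the first keep up on large inputs:

Code:
#include <stdio.h>

/* O(n): scan every element. */
static int linear_search(const int *a, int n, int key)
{
    for (int i = 0; i < n; i++)
        if (a[i] == key)
            return i;
    return -1;
}

/* O(log n): halve the search space each step; requires sorted input. */
static int binary_search(const int *a, int n, int key)
{
    int lo = 0, hi = n - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;
        if (a[mid] == key)
            return mid;
        else if (a[mid] < key)
            lo = mid + 1;
        else
            hi = mid - 1;
    }
    return -1;
}

int main(void)
{
    int data[] = {2, 3, 5, 7, 11, 13, 17, 19};
    int n = sizeof data / sizeof data[0];
    printf("%d %d\n", linear_search(data, n, 13), binary_search(data, n, 13));
    return 0;
}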

I don't mention any online resources because I prefer not to use them; I'm far more comfortable with books. Maybe others can recommend online resources that appeal to you.

Hope that helps!
 
@mikeblas
Thanks for the information. I should have clarified earlier that when I talk about efficiency, I mean code efficiency, not how fast I write the code, as I know my speed will go up the more I use it.

As far as using the original materials, I honestly do not know what they were, and since I never got to use what I learned, nothing stuck. I can look at a C program and read it to get an idea of what is going on, but I do not remember all the functions, the correct syntax, etc.

Also, thanks for the info about the Algorithm references.
Bob
 
As many others in this thread have said, The C Programming Language by K+R is a really good book for writing in C.

As for a book about C library functions, you could consider The Standard C Library.
 
For OOP, I'd start with C# at a minimum and work up to Java (the two are similar). C's procedural style of programming is kind of 'outdated' now compared to the languages commonly used to write programs.

This. Why waste time on an (almost) obsolete language?

Time is better spent learning C# or another newer language.
 
This. Why waste time on an (almost) obsolete language?

Time is better spent learning C# or another newer language.

Languages like C# and Java are cool and all (kinda), but if he's writing code for an embedded platform, C isn't a bad idea.

Plus, learning C just isn't a bad idea in general. Too many kids nowadays don't understand how languages like Java work, because they never learned how to do the things the platform is doing for them.
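One concrete example of "the things the platform is doing for them": Java's garbage collector reclaims memory behind your back, while in C every allocation is your problem. A minimal sketch:

Code:
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    /* In C you ask for memory explicitly... */
    char *copy = malloc(32);
    if (copy == NULL)
        return 1;                 /* ...and you handle the failure case yourself. */

    strcpy(copy, "no garbage collector here");
    printf("%s\n", copy);

    free(copy);                   /* ...and you give it back, or you leak it. */
    return 0;
}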
 
Languages like C# and Java are cool and all (kinda), but if he's writing code for an embedded platform, C isn't a bad idea.

Plus, learning C just isn't a bad idea in general. Too many kids nowadays don't understand how languages like Java work, because they never learned how to do the things the platform is doing for them.

Taking that into account, he might as well learn assembly! Or maybe even native machine code! :rolleyes:
 
Taking that into account, he might as well learn assembly! Or maybe even native machine code! :rolleyes:

Someone who doesn't know assembly (or machine code) doesn't know how computers work.
 
Someone who doesn't know assembly (or machine code) doesn't know how computers work.

I think this is an especially important thing to keep in mind when we consider that the thread starter said he's programming embedded systems. On weaker devices that cannot afford to be as liberal with memory or CPU power, and that might not be running comprehensive frameworks or operating systems that provide APIs to do all the low-level tasks for you, it's especially important to understand what your hardware is actually doing.
 
Nobody "knows" machine code. It's fucking hexadecimal values. You're not going to sit down and start writing a custom ROM for your embedded system using machine code and a hex editor.

Everyone should spend a few hours understanding assembly and how it works, but they'll never use it again.

The OP wants to code on PC, iOS, and Android. Don't throw away years of common library development to learn assembly. Even C is borderline obsolete on those platforms.
 
The OP wants to code on PC, iOS, and Android. Don't throw away years of common library development to learn assembly. Even C is borderline obsolete on those platforms.

I read this:

....and other embedded systems where I have designed the hardware and would like to write my own code for it.

...and if he plans to do that, he'll need to know assembly pretty well, since he's going to need to be able to either program his devices in assembly directly, or write an LLVM backend.

Besides, even if that weren't the case, it's a good idea to have written some amount of assembly. The people who skipped out on learning assembly wind up being the people who have no hope of identifying that the problem they're having is a compiler bug, or other rare but deep problems.
 
The people who skipped out on learning assembly wind up being the people who have no hope of identifying that the problem they're having is a compiler bug, or other rare but deep problems.
Or, not-so-rare problems. Understanding how the machine works is requisite to success in designing non-trivial algorithms. Sure, some people just want to learn to code as a pastime, or to get a job doing menial programming (not engineering, not computer science) tasks. Fine for them!

Other people are interested in understanding from the bottom up. I'd agree that those people are generally the ones who turn out to be more successful in the long run.

Meanwhile, nobody's suggested "throwing away years of common library development". Learning assembly doesn't require that. Assembly and C don't become obsolete, because they're about machine architecture, and therefore about understanding computing at a fundamental level.
 
Or, not-so-rare problems. Understanding how the machine works is requisite to success in designing non-trivial algorithms. Sure, some people just want to learn to code as a pastime, or to get a job doing menial programming (not engineering, not computer science) tasks. Fine for them!

And there is some truth to this, too. I have colleagues who did not pursue the lower-level concepts and the theoretical subjects. They can write a lot of the simpler pieces of code, but occasionally they write completely non-performant code. In an extreme example, two of them were working together, organizing hierarchical data into a DFS ordering that was tied to a user clicking on a dropdown and showing the options in that order. In our QA region, their code was taking in excess of 10 minutes to run, and I was able to rewrite it to run on the same data in under 0.6 ms (a much more reasonable time to expect our users to wait). I don't want to sound like some kind of extremist, but at the risk of doing so, a part of me believes their attempt would have been a lot better if they had a deeper background in the workings of the machine.
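For the curious, a DFS ordering of a hierarchy only needs a single linear-time walk. A minimal C sketch of the idea (not the actual project code; the data and names here are invented):

Code:
#include <stdio.h>

/* Hierarchy stored as first-child / next-sibling links (-1 means none).
 * One preorder walk up front is O(n); re-scanning the whole hierarchy
 * on every click is what blows up. */
struct node {
    const char *label;
    int first_child;
    int next_sibling;
};

static const struct node tree[] = {
    /* 0 */ { "root",    1, -1 },
    /* 1 */ { "fruit",   3,  2 },
    /* 2 */ { "veg",     5, -1 },
    /* 3 */ { "apple",  -1,  4 },
    /* 4 */ { "pear",   -1, -1 },
    /* 5 */ { "carrot", -1, -1 },
};

/* Preorder DFS: emit a node, then its children, then its siblings. */
static void emit_dfs(int i, int depth)
{
    if (i < 0)
        return;
    printf("%*s%s\n", depth * 2, "", tree[i].label);
    emit_dfs(tree[i].first_child, depth + 1);
    emit_dfs(tree[i].next_sibling, depth);
}

int main(void)
{
    emit_dfs(0, 0);
    return 0;
}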

P.S., having a dropdown menu take 10 minutes to populate is a good way to be the butt of every performance joke from then on.
 
Nobody "knows" machine code. It's fucking hexadecimal values. You're not going to sit down and start writing a custom ROM for your embedded system using machine code and a hex editor.

Everyone should spend a few hours understanding assembly and how it works, but they'll never use it again.

The OP wants to code on PC, iOS, and Android. Don't throw away years of common library development to learn assembly. Even C is borderline obsolete on those platforms.

I've written plenty of custom ROMs in machine code over the years. In the embedded world, you'll probably have to do it at one point or another. And I've done TONS of assembly. The embedded world is different; you actually care about maximizing performance.

As for C, it's still the preferred language for driver development. And iOS still forces Objective-C for apps, which is basically C with a lot more idiot-proofing built in.
 
And this is why you need to understand how the platform you are running on actually works:
That's an awesome link. I don't think someone new to programming really needs to go that deep, but people who simply use packaged libraries for data structures and haven't a clue how heavy or efficient those data structures are aren't going to get too far with computer science. Maybe they don't want to (or need to) get far with computer science, so it's a decision they can prioritize ...
 
To the OP: brushing up on C is fundamental, and an excellent choice. If you're working with microcontrollers and embedded development, it's indispensable. No doubt you're familiar with Arduinos and AVR micros in general; they have their drawbacks, but they make for really convenient prototyping (sure, you could probably build the quirky widget you're looking for yourself, or you could prototype your product with a cheap shield that saves you hours of hardware debugging). If you don't like the Arduino IDE and the dumbed-down adaptation of C that it uses, you can always program the AVR chips with AVR Studio, the way they were meant to be used.
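As a taste of the difference, here's the classic blink program written against plain avr-libc instead of the Arduino digitalWrite() style (assuming an ATmega328P with the LED on PB5, i.e. Arduino pin 13, built with avr-gcc):

Code:
#define F_CPU 16000000UL        /* 16 MHz clock; needed by the delay routines */
#include <avr/io.h>
#include <util/delay.h>

int main(void)
{
    DDRB |= _BV(PB5);           /* configure PB5 as an output */

    for (;;) {
        PORTB ^= _BV(PB5);      /* toggle the pin by writing the port register directly */
        _delay_ms(500);
    }
}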

More broadly, C has had a tremendous influence on modern languages. Many languages are now considered "C-like" in terms of syntax and control flow: https://en.wikipedia.org/wiki/List_of_C-family_programming_languages. Where we have resources like RAM and CPU to burn, the abstractions that have developed over time make life much, much easier. Using C# as an example: sure, you could develop GUIs and complex applications for Windows or Linux with C or C++, but in terms of programmer productivity, you're likely to get much more from your time using abstractions like WPF.

If you're looking to maximize your productivity on a cross-platform basis, there are a few options. You can't run Java apps on iOS, and you can't run Objective-C or Swift on Android. Because RAM and CPU power are relatively plentiful, and because JavaScript engines are quite mature, JavaScript is of paramount importance. Proprietary solutions (see: Xamarin for C#) let you build in one language and deploy, with minimal changes, to iOS, Android, and others, but JavaScript has some great open-source tools that give you a native-like feel and cross-platform flexibility.

Basically, unless you're building something like a very intense 3D game, JavaScript is worth prioritizing. PhoneGap/Cordova is a mature toolchain, and Ionic Framework is a really cool suite of tools for packaging AngularJS apps into hybrid mobile applications. Even if you are building a game or something more performance-critical, the tools are there to do it in JavaScript; look at what a Microsoft team did with Babylon.JS: http://flightarcade.com/

The HTML5 spec gives you things like local SQLite access, browser & window local storage, camera & mic access, GPS/location, and offline app capability even without stuff like the Cordova plugins that wrap native device functionality. A vocal minority of developers have been saying for years that JavaScript is the future of development, and for most applications, I buy it. TypeScript builds on it as well, and is worth a look for more complicated usage.

JavaScript (even with serious optimization like what Asm.js provides) is still not optimal for the highest performance code- it's still a high-level language, and still gives no real control over threading. But increasingly, high-performance scientific and financial computing applications crunch the numbers server-side, and then spit data back to a JavaScript/HTML5 frontend.

In short: C's great, and not going anywhere on low-power embedded platforms where RAM is measured on the order of kilobytes. For everything else, there's JavaScript. Preferably with a framework like AngularJS or React.JS.
 
That's an awesome link. I don't think someone new to programming really needs to go that deep, but people who simply use packaged libraries for data structures and haven't a clue how heavy or efficient those data structures are aren't going to get too far with computer science. Maybe they don't want to (or need to) get far with computer science, so it's a decision they can prioritize ...

There's a split among developers right now on things like algorithm and time/space complexity questions, and whether they're necessary. If you're applying for a job as a high-frequency trader or quant developer, and will be depended on to produce extremely tight, performant code, this matters. If you're applying for a job developing machine learning algorithms, and your work involves sublinear algorithms, you'd better believe this matters. But for the huge subset of people making good livings as mobile/frontend devs, or those pushing out simple web applications in a corporate environment? Not as important. Managers often still fire off the algo questions in interviews to weed out the non-comp-sci majors, rather than asking questions about problem-solving.

I am a psych major who worked in finance for many years, learned to program by necessity, started and ran a quant hedge fund, and now runs a tech startup. I've got a number of good offers at this point to work as a consultant dev (gotta keep the lights on) while building my own business, but I feel like half of the interviews I've been on have been more about trivia than practical programming. My point is that I've gone pretty far into the tech side of things, and am finding that a lot of the really wonkish stuff isn't that useful in a majority of the things I encounter. Meanwhile, comp sci curricula often skimp on things like jQuery, CSS, and workflow management that many would consider fundamental.
 
Meanwhile, comp sci curricula often skimp on things like jQuery, CSS, and workflow management that many would consider fundamental.

Rightly or wrongly, I suspect a lot of students go into CS wanting to broadly learn software engineering. I'm conflicted on the degree to which CS programs ought to be teaching vocational skills.
 
Rightly or wrongly, I suspect a lot of students go into CS wanting to broadly learn software engineering. I'm conflicted on the degree to which CS programs ought to be teaching vocational skills.

The money in JavaScript and mobile dev right now is out of control. Clearly, there's more of a need for these skills in the market.
 
Meanwhile, comp sci curricula often skimp on things like jQuery, CSS, and workflow management that many would consider fundamental.

That's because jQuery, CSS, and workflow management have absolutely nothing to do with computer science whatsoever.

jQuery is hardly fundamental to client-side JavaScript, let alone to computer science as a whole.

Computer science isn't about getting a job as a two-bit web developer. It's a huge umbrella, and having a class on jQuery in a computer science degree would be a lot like having a class on using screwdrivers in a physics degree.

The money in JavaScript and mobile dev right now is out of control. Clearly, there's more of a need for these skills in the market.

Right, and having someone who spent the last 40 years of their lives designing graph algorithms on a chalk board attempt to teach them how to use jQuery is not going to give them those skills. Computer science programs are a great place to learn theory and fundamentals. If you want practical skills, those are better achieved in practice.
 
The money in JavaScript and mobile dev right now is out of control. Clearly, there's more of a need for these skills in the market.

No disagreement there; I wonder, though, if a CS program is the right place to teach those skills.

Especially in an undergraduate degree, there is a limited number of subjects one can study, and including technical courses that teach skills that aren't really computer science is a hard sell.

There are a variety of programs that claim to prepare people for mobile and web development. I hear many are of questionable quality, but I think specialized programs like this, which can supplement or even substitute for a traditional degree, make a lot of sense, assuming they are done right.
 
That's because jQuery, CSS, and workflow management have absolutely nothing to do with computer science whatsoever.

jQuery is hardly fundamental to client-side JavaScript, let alone to computer science as a whole.

Computer science isn't about getting a job as a two-bit web developer. It's a huge umbrella, and having a class on jQuery in a computer science degree would be a lot like having a class on using screwdrivers in a physics degree.
To actually get things done in the real world, jQuery is a fairly inescapable standard- at some point, you'll have to port or wrap a library that relies on it, because it's ubiquitous. To say that it's not fundamental in 2015 is pretty bold.

As for the two-bit web devs: well, the people going through boot camps and writing shit code aren't getting the jobs that are worthwhile. Major companies are moving to web frontends for their applications. I got a ton of action (and a job) from this search: http://www.indeed.com/jobs?q=angularjs+$140,000&l=New+York , and I have more consulting opportunities than I know what to do with. You can't be professionally useful with something like AngularJS without a good understanding of jQuery.

Why can someone like me, who became a developer only a few years ago as a career backup plan and sideline to running another business, pull a $200k salary while also working on other stuff? Because web development is a pretty in-depth professional discipline that's in incredible demand, and it's not taught well in schools. Doing it well in a team environment requires a good knowledge of DevOps and workflow, numerous technologies, and, yes, a certain amount of computer science knowledge, but I'd probably have run screaming from development in general if I had been forced to sit through a class on algorithms or compilers rather than coasting through business and psych classes.

Right, and having someone who spent the last 40 years of their lives designing graph algorithms on a chalk board attempt to teach them how to use jQuery is not going to give them those skills. Computer science programs are a great place to learn theory and fundamentals. If you want practical skills, those are better achieved in practice.
Your physicist/screwdriver analogy immediately made me think of engineers: physicists talk and do math, engineers build things. In reality, good developers need to be tinkerers, while I know physicists who aren't mechanically inclined enough to tie their own shoes. The point is, engineers go to school for a reason, although the stars are already doing the practical stuff on their own. A structured course of study for the semi-informed tinkerers of the web/mobile dev world would produce real value and positive professional outcomes.

If you're formally studying comp sci because you love it, great. If you're doing it because you find it an interesting challenge, then your PhD will be something you treasure; if you're not getting one, reassess your motives. For everyone else, it's about the job and the salary.
 