Want to relearn C, what book?

To actually get things done in the real world, jQuery is a fairly inescapable standard- at some point, you'll have to port or wrap a library that relies on it, because it's ubiquitous. To say that it's not fundamental in 2015 is pretty bold.

Libraries can use jQuery all they want internally without a developer needing to 'know jQuery'. Of course, most libraries, at least the ones where the authors actually want people to use their stuff, don't use jQuery or at minimum don't pull in full jQuery, because jQuery is both large and largely unnecessary. Angular uses jQuery's API for DOM interactions, but by default it uses a stripped down version due to jQuery's weight.

jQuery served a purpose in the last decade. It provided an 'easy' way to do most things in a 'cross-browser' fashion, back when it was a pain in the ass to do something as simple as get the text content of an HTML node in cross-browser code. Luckily, both the standards and the browsers in current use have advanced considerably, and doing things in vanilla JavaScript is not hard. Browsers now ship the fetch API, so I don't need jQuery for AJAX; ES6 has a promise spec that isn't idiotic, so I don't need jQuery Deferreds; and for anybody using shadow DOM or shady DOM, jQuery's DOM features become largely neutered and useless. Any expressive methods jQuery offers, like extend, are better provided by something like lodash.
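To make that concrete, here's a rough sketch of the kind of AJAX call that used to justify pulling in jQuery, done with the native fetch API and a plain Promise chain. The /api/users endpoint is a made-up placeholder.

```javascript
// jQuery style, for comparison:
// $.ajax({ url: '/api/users', dataType: 'json' })
//   .done(function (users) { console.log(users); })
//   .fail(function (xhr, status) { console.error(status); });

// Vanilla equivalent with fetch and a standard Promise chain:
fetch('/api/users')
  .then(function (response) {
    if (!response.ok) {
      throw new Error('HTTP ' + response.status);
    }
    return response.json();
  })
  .then(function (users) {
    console.log(users);
  })
  .catch(function (err) {
    console.error(err);
  });
```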

In 2008, jQuery was fairly fundamental. In 2015, the world doesn't really need it anymore.

As for the two-bit web devs- well, the people going through boot camps and writing shit code aren't getting the jobs that are worthwhile.

That's because the boot camps focus on teaching things like jQuery and CSS (probably in a 'use Bootstrap for all of your CSS' manner, too), instead of teaching a good computer science foundation. You've just pointed out the exact reason why CS programs should not be teaching jQuery and CSS... The last thing we need is for computer science programs to be MORE like those bootcamps.

I'd rather hire someone who understands that the DOM and CSSOM are trees, and the implications that has for manipulating and rendering in the browser, but who has never touched jQuery, than someone who doesn't really understand trees but can barf out tons of jQuery spaghetti.
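As a rough illustration of why the tree structure matters in practice: every insertion into the live document can trigger style and layout work up and down that tree, so the usual move is to build a subtree off-document and attach it once. The #list selector here is a hypothetical placeholder.

```javascript
var list = document.querySelector('#list');
var fragment = document.createDocumentFragment();

for (var i = 0; i < 1000; i++) {
  var item = document.createElement('li');
  item.textContent = 'Item ' + i;
  fragment.appendChild(item);  // builds the subtree off-document
}

list.appendChild(fragment);    // one insertion into the live tree
```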

Major companies are moving to web frontends for their applications. I got a ton of action (and a job) from this search here: http://www.indeed.com/jobs?q=angularjs+$140,000&l=New+York , and have more consulting opportunities than I know what to do with. You can't be professionally useful with something like AngularJS without having a good understanding of jQuery.

So do a lot of people. There's a bottomless pit of opportunity in cleaning up the messes that people who don't understand what's going on under the hood create in entirely client-side single-page web applications built with a framework like Angular, which runs a synchronous digest loop with dirty-checking-based two-way data binding. If someone is going to be writing a large-scale AngularJS application, whether or not they know how to use jQuery is the least of my concerns.
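This is the classic shape of the digest-loop trouble I mean. The module and controller names below are hypothetical, but the pattern is the standard Angular 1.x one: an expensive function bound straight into a template gets re-evaluated on every single digest, not just when its inputs change.

```javascript
angular.module('demo', []).controller('ReportCtrl', function ($scope) {
  $scope.rows = /* ... thousands of records ... */ [];

  // Anti-pattern: bound in the template as {{ expensiveSummary() }},
  // this re-runs on *every* digest cycle, not just when rows change.
  $scope.expensiveSummary = function () {
    return $scope.rows.filter(isInteresting).map(summarize);
  };

  // Better: recompute only when the collection actually changes.
  $scope.$watchCollection('rows', function (rows) {
    $scope.summary = rows.filter(isInteresting).map(summarize);
  });

  function isInteresting(row) { return row.active; }
  function summarize(row) { return row.name; }
});
```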

Anybody who's been around web development through the shift to client-side frameworks knows the kind of trouble people got into with Backbone and Angular because they weren't paying attention to what was really going on when they were using features of those frameworks. They're great, powerful frameworks, but only if you have a good understanding of their inner mechanics and what's going on under the hood. People who skipped out on learning about algorithms and compilers in pursuit of learning jQuery and CSS are not going to have an easy time learning how their platforms actually work, and that spells trouble with a capital T.


but I'd probably have run screaming from development in general if I had been forced to sit through a class on algorithms or compilers rather than coasting through business and psych classes.

Which is probably a problem of delivery rather than potential for practical application. Too much of the post-secondary education world doesn't do a good job of encouraging students to learn. Algorithms, compilers, and computer architecture are probably the three classes that will set you apart as a web application developer, yet these are the things often cited by so-called web developers as 'useless' or 'impractical'. It really is a sad state of affairs, because these are the tools of true engineering.

To bring this back around to the idea of whether or not people should learn C, even though they're probably going to go off and get a job writing C# or JavaScript or TypeScript and 'never touch C': C forces you to get into the habit of caring what's happening lower down. You develop the ability to figure out what a sort function you're trying to use is doing under the hood, what its performance characteristics may be, and what implications it could have on your larger application. Writing fast C code teaches you to reason about whether a cache miss or a branch mispredict will actually matter for you. It teaches you to make back-of-the-napkin predictions about how much time something might take to run, or whether a given approach will work for an input size of a certain order of magnitude. Even when a library or system API is handling something for you, you often wind up being mindful of what it's actually doing. Maybe not down to exact line numbers, but you at least try to figure out how it works.

This same skill turns out to be very important for web developers. A good JavaScript developer should be able to learn how traversing scope chains and prototype chains to resolve variables can impact their application's performance, how the garbage collector works, how memory leaks in JavaScript can occur, why sparse arrays are so expensive to work with, etc. A good JavaScript developer needs to be able to understand what the libraries and frameworks they're using are actually doing if they're going to write fast, maintainable, scalable applications. A lot of these skills come naturally to someone with a good background in computer science and with experience working closer to the machine. The JavaScript runtime is a 'virtual' machine after all, and while it's much more abstract than an i386, the ability to intimately understand what it's doing is of the same nature.
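Two small examples of the sort of thing I mean, runnable in a browser console. Exact engine behavior varies, so treat the comments as the general idea rather than a spec.

```javascript
// Sparse arrays: assigning far past the current length creates holes, and
// engines typically fall back to a slower, dictionary-like representation
// for arrays like this.
var dense = [];
for (var i = 0; i < 5; i++) {
  dense.push(i);               // contiguous, fast element storage
}

var sparse = [];
sparse[100000] = 'x';          // one real element, 100,000 holes
console.log(sparse.length);    // 100001
console.log(5 in sparse);      // false - a hole, not the value undefined

// A classic leak shape: a long-lived listener's closure keeps a detached
// DOM node reachable long after setup() has returned.
function setup() {
  var detached = document.createElement('div');   // never appended to the DOM
  document.addEventListener('click', function () {
    console.log(detached.id);                     // closure pins the node
  });
}
setup();
```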
 
Dogs- I think that we're in agreement on a lot of this, but coming from different angles. To be effective with AngularJS (especially when building directives or fixing something you found on GitHub), you need to know enough jQuery to be dangerous- Angular ships with jqLite (an included subset of jQuery), but it will use the full jQuery instead if you include it in the page before Angular. I work on big, complex web applications that depend on a lot of libraries for a lot of different things, and it's often not feasible to spend the time porting them to native AngularJS. It's been incredibly helpful to understand how the two interact. But yes, shadow/virtual DOM is the future.
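A small sketch of where that distinction bites you in practice. The directive name is made up, but the behavior of angular.element is the documented one: it wraps jqLite by default, or full jQuery if jquery.js was loaded before angular.js.

```javascript
angular.module('demo', []).directive('highlightOnClick', function () {
  return {
    restrict: 'A',
    link: function (scope, element) {
      // .on() and .addClass() exist in both jqLite and full jQuery...
      element.on('click', function () {
        element.addClass('highlighted');
      });
      // ...but something like .closest() is jQuery-only, so a line like
      // this works only when full jQuery was included before Angular:
      // element.closest('.panel').addClass('active');
    }
  };
});
```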

I see a LOT of bad JavaScript and badly-written Angular applications. I've flipped through friends' design patterns textbooks and course notes, and have seen a lot of things that wouldn't really help people build modern web applications well. You've obviously had some exposure to Angular, so you probably get what I mean when I say that hanging everything off of $scope and not using services effectively makes me cringe.
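For anyone following along, this is roughly the contrast I mean; the service and controller names are hypothetical. Hanging shared state off $scope (or $rootScope, or controller inheritance) couples everything to the view hierarchy, while a service keeps the state in one injectable place.

```javascript
angular.module('demo', [])
  .factory('CartService', function () {
    var items = [];
    return {
      add: function (item) { items.push(item); },
      count: function () { return items.length; }
    };
  })
  .controller('ProductCtrl', function ($scope, CartService) {
    $scope.addToCart = function (product) {
      CartService.add(product);   // state lives in the service...
    };
  })
  .controller('HeaderCtrl', function ($scope, CartService) {
    // ...and any other controller reads the same state, no $scope chains needed
    $scope.cartCount = function () { return CartService.count(); };
  });
```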

Intuitively and practically understanding memory usage, garbage collection, and how algorithms work is important. I find that in practice, it matters less in the form it's taught/delivered in- I may not care about the time complexity of an algorithm in the strict Big O terms beaten into people in the classroom, but when I see deeply nested loops in other people's code, it's usually a sign that it's time to figure out a better way to handle things.
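The most common version of that nested-loop smell, and the usual fix, look something like this (the data names are made up for illustration):

```javascript
var orders = [ /* ... */ ];
var flaggedIds = [ /* ... */ ];

// O(n * m): for every order, scan the whole flagged list.
var flaggedOrdersSlow = orders.filter(function (order) {
  return flaggedIds.indexOf(order.id) !== -1;
});

// O(n + m): build a Set once, then each lookup is constant time.
var flaggedSet = new Set(flaggedIds);
var flaggedOrders = orders.filter(function (order) {
  return flaggedSet.has(order.id);
});
```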

You're arguing on behalf of the fundamentals, which I support to the extent that it's practical. However- I used to run a small quant hedge fund, and employed a number of developers. We used mainly C#, Python, and C++. I've had a dozen or so employees and interns over the years with advanced degrees in comp sci and applied math from a school consistently ranked in the top 10 U.S. programs in both, who have been functionally useless or who were using tech stacks that are just not relevant in the real world. Now that I'm in a different field, I find similar problems.

I'm not saying that there's no place for comp sci programs as they exist now, but at some point, disciplines branch out where practical and theoretical concerns diverge. That's why you have applied & pure math, physics & engineering, et cetera. An "applied computer science" degree would be commercially valuable- you can't learn much of value in a 6 week boot camp, but you could spend 4 years studying things that people actually use and graduate with a way higher degree of preparedness.
 

And I'm saying if you want those students to come out of school with lots of bad habits and misconceptions ingrained in them, having an 'applied computer science' degree that teaches them how to misuse jQuery and Angular is a fantastic idea. If your goal is to have them be prepared for work, then that would be completely counter-productive. You'll end up with students who have less theoretical background, and an added penalty of thinking all of the things their school told them were practical are actually practical.

I don't know if you've looked at what they teach in computer science degrees, but a lot of schools already try to add curriculum to promote practical skills, and those parts of the degrees are fundamentally bad. We don't need developers to have more of that. What we need is to get rid of this stupid expectation that computer science classes are supposed to magically make you a software engineer, and start encouraging people to get their actual practical skills through actual practice. Instead of wedging in stupid shit, like 'Web Application Architecture' classes that teach people how to build stateful server-side routes in PHP and pretending that's somehow an acceptable way to write web applications in 2015, maybe universities should spend more effort trying to build relationships with good companies that can place students in internships and co-ops, and focus on making the theoretical parts of their curriculum more interesting to students and more easily applied when the student leaves the classroom.
 
Title says it all- I learned C probably 15 yrs ago and never got a chance to really use it, so, well, I can't program in C. I had an old book, but figured there are so many technology differences between then and now, with all the different platforms, etc., that I may as well relearn with a strong core from this century.

My goals, to guide your suggestions:
- Relearn good core C programming principles. I have a feeling a lot of this will come back to me, but I'd like to make sure I'm guided correctly.

- I use multiple platforms, so I'll be programming for PC, and also for iOS, Android, and other embedded systems where I've designed the hardware and would like to write my own code for it.

Please advise,
Bob

I recommend books written by Tony Gaddis.
 