Computer Science Students Should Learn to Cheat, Not Be Punished for It

Rather an interesting perspective, especially given some of the recent lawsuits which revolve around programmers doing precisely what the author is advocating...
 
I'm trying to remember my finals... most were the culmination of one large (or a couple of large) programming projects that you'd been working on for an entire semester. I also remember that some questions would list some output and you had to write the code that produced it. Other finals, especially toward the end of the degree, were completely open book/notes/net and you could work with your friends. And no, that didn't make it easier (kind of like how multiple choice ends up being more difficult than normal questions. Ever have a prof take the simple mistakes people actually make... and then list them as answer choices? Yeeeaah). Actually, depending on the class, working with other people could be a tremendous waste of time. The profs always taught us that in the "real world" you'll have access to reference manuals, co-workers, the internet, etc. to help solve problems. I don't think plagiarism ever came up, but not re-inventing the wheel certainly did. That, and the fact that you'd very likely end up being re-trained at a job to program the way that company wants you to... and something taught as the right way to program could be the exact opposite elsewhere. Heh! Fun times.

All I know is, "copied code" seldom works on something else without significant modification (unless it's for the exact same program?). It's usually a good example, but almost never the solution.
 
I would have to agree with this at a glance. You shouldn't be learning to "program" with a 4 year Computer Science degree, you should be learning to design. It is things like data structures, design principles, test methodologies, and inheritance that separate computer scientists from programmers.

Frankly, stealing code snippets from the internet is optimization, not plagiarism. You still have to choose your snippets wisely, lest you simply get sent down a path with new, different problems. Putting so much emphasis on unique, syntactically correct code just divorces you from trying to understand the problem.
 
Rather an interesting perspective, especially given some of the recent lawsuits which revolve around programmers doing precisely what the author is advocating...


Lawsuits are, in and of themselves, a revenue vehicle. All they really prove is that IT intellectual property is a field that always seems fresh for the plowing.

Take a good look at IT patents: they don't revolve around the content of the code. IT patents are written to protect the "idea".

"I patented a middle-out compression algorythm blah blah" (Yea, I've been binge watching Silicon Valley)

It's because, as I said earlier, there's more than one way to skin a cat. If I don't patent the idea itself (the cat), meaning what an app does, its functionality, its "soul", and I just patent the code itself, someone could steal the idea and re-engineer the code. A different way to skin the cat.

So you patent the cat, and then it doesn't matter how you make the cat; it's still a cat. And I hold the patent on cats.

Of course, if I have unique code written to make cats have sharp teeth and you use my code so that your dog also has sharp teeth, that doesn't mean I won't try to make some cash with a lawsuit.

As I've been told "It's just good business".
 
As long as what you're doing conforms with the software license, there should be zero concern. That's what fucking code licenses are for. Even if it's the BSD or GPL.
 
Like solving a math equation, points should be awarded for showing your work, not just the answer. If a student copies code, but is able to explain how that code accomplishes the task, he deserves at least some credit. If the class isn't being taught / tested / scored in a manner that allows for this, maybe that should change?
Great in theory, but nobody has time to grade like that (unless it's all automated now... it's been a long time since I was in school). Personally, I think in college you need to do your own work. That doesn't mean someone can't help with some section of code, but you're never going to be a decent developer if all you can do is copy someone else's code. Yes, there's a lot of copy/paste, but there's also a lot of actual coding... and I've worked with people who can't code. It's a PITA, because you have to do their work for them.
 
I teach programming in college. We proctor exams to make sure the students are capable of doing the coding. It's open book. Sure, you can Google to find solutions, but if you don't understand what you did, how can you pass an employment test to land a job? We talk to employers (BofA, Duke Power, etc.) every year to find out what they need in programmers, and they always say that potential employees have to know their shit. Sure, some training is given, but if you copy and paste your way through college, they'll know.
 
I both agree and disagree with this... I definitely think it was worth learning to do it the "hard" way while in school: learning the data structures, how the different sorts worked, etc. All the inner workings of what was happening in those pre-built libraries (which we weren't allowed to use; we had to create our own stacks/queues/lists/etc.). We couldn't use them even in our later classes, however, which I don't agree with (it wasn't until I switched from C++ to C#, about 3/4 of the way through my college career, that I actually learned on my own how to use non-standard or 3rd-party libraries). But it was worth it, and now I understand what's going on and can troubleshoot performance a bit better because I understand why something is happening in the back end.
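
For anyone who skipped that rite of passage, "create our own stacks" means something like the following; a minimal sketch in Python for brevity (the poster's classes used C++, and the example is mine, not theirs):

```python
# A build-it-yourself stack: no library containers, just nodes and links.
class Node:
    def __init__(self, value, next_node=None):
        self.value = value
        self.next = next_node

class Stack:
    def __init__(self):
        self._top = None  # empty stack

    def push(self, value):
        # New node points at the old top and becomes the new top.
        self._top = Node(value, self._top)

    def pop(self):
        if self._top is None:
            raise IndexError("pop from empty stack")
        value = self._top.value
        self._top = self._top.next
        return value

s = Stack()
s.push(1)
s.push(2)
print(s.pop())  # -> 2
```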

We also had professors who would strongly discourage (or straight up forbid) working with other classmates, something I strongly disagreed with, so I worked on projects with friends anyway. We'd find an empty classroom after hours on campus, order a pizza, figure out the logic on the whiteboards together, and then start coding. If one of us ran into a problem, we'd help each other through the logic or code issue and continue on together. Our code ended up looking really similar because our logic was basically the same (pretty sure the prof knew we were working together too), but we knew how it worked and understood the concepts, so we passed the tests just fine and did well on the projects. Other professors encouraged working together but would make the tests slightly more difficult to be sure that you did indeed understand the concepts.

I have been through the academic rigor, and when coding in Verilog all my prof asked us to do was insert a descriptor saying where we got the code and what it does. If the code did the job, he didn't care. If it was messed up, he berated you.
Same for my research papers: lead with the author's name and cite it, then no one cares and everyone gets credit.
If you're trying to pass shit off as your own to make yourself look like a genius... be prepared to fall hard.
I feel like we went to the same school lol. In one of my classes (not a CS course, but required for CS students), we had to do Verilog, and the prof was very strict about reusing code you found anywhere; he'd basically automatically fail you if he thought you took code from somewhere else without attributing the source.
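
To make that concrete, here's roughly what such a descriptor looks like in practice; a hypothetical Python example (the function, the source placeholder, and the change notes are all made up for illustration):

```python
# Source: adapted from <author / URL of the original snippet goes here>
# What it does: returns the n-th Fibonacci number iteratively.
# Changes made: renamed variables and added the negative-input guard.
def fib(n):
    if n < 0:
        raise ValueError("n must be non-negative")
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a
```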

I think this is similar to math proofs. You would work on proofs during class and homework, but you had to come up with the right proof at the right time during the test.

Perhaps testing for CS needs to change so that you write code during the test without the aid of the Internets?
We had to write sample code on tests in my early CS courses... depending on the teacher, it could just be pseudocode, but another required basically correct syntax. I hated that because it took too much damn time to write it out on paper and be sure you were correct. And if you made a mistake at the beginning of the code? Lol, gotta erase a bunch or try to squeeze an additional line in between two other handwritten lines of code.
 
This is a really tricky subject. First off, a lot of people think that academics teach people how to think. Personally, I don't believe this to be true; it has been my experience that the real problem solvers and thinkers in life are VERY few and VERY far between. What we have instead are mostly people who are good at looking at a problem, recalling or finding a similar problem, and then adapting that other solution to this slightly new (or exactly the same) problem, and this group spans a wide range of people, from very good to barely passable.

Actually coming up with truly novel, or even slightly novel, solutions is a very rare event. Yet we all try to fool ourselves into thinking we are doing our own work every day, when really our brains are just matching an existing problem to one we saw solved in the past. The best solutions are very often ones where someone worked out a solution and then it was improved upon many times until it just works very well, and those are probably the solutions you find in code dumps. Then others adapt from those. For this reason, the OP really isn't that far off. The tricky part is how much plagiarism you allow before you realize the students aren't really capable of understanding the problem, or even properly solving it at all. That's really all academics are typically trying to vet out and avoid.

Another thing people should understand is that really novel solutions and thinking are not something you can constrain to a class period or a 2-hour test. The kind of thinking that comes up with new solutions can take a week, a month, or even a year. Efficiency comes when the very few really novel problem solvers are encouraged to share their solutions. That's how the patent system is supposed to work.
 
This is both wrong and right at the same time. You need to be able to write original code from scratch, but you also need to be able to take someone else's existing code and adapt or modify it. This is what the entire open-source movement is largely based on, and as long as you include the appropriate credits and GPL licenses, it isn't plagiarism. Personally, I think more schools should be giving classes that focus on being able to download code from a publicly available open-source Git repository and then adapt it to do a slightly different task. 90% of my job involves making modifications to the Linux kernel, writing hardware-specific drivers (often based on pre-existing drivers for other parts), or modifying the boot loader. That said, I also occasionally have to write full drivers from scratch. So this is not totally an "either/or" proposition.

One final comment I will make is with regard to "computer science" vs. "computer engineering". CS is aimed much more at academia and often tends to teach techniques and methods that are totally unsuited to writing the stable, deterministic code that makes sense in the "real world". Essentially, many CS programs tend to produce graduates who are suited for going into academia and not much else. CE programs tend to be much more "real world" focused and much more often produce people I'd be willing to hire in industry. Specifically, at a job I had many years ago, there was an unwritten but understood policy of typically not hiring CS majors from the local university, simply because they had too many bad habits to "unlearn" and way too much theoretical background with no real-world programming/debugging experience. There were exceptions to this (one of the best programmers I know, JPV, was one of them), but he was the exception rather than the rule.
 
Gotta wonder if this isn't a Computer Science vs. Software Engineering discussion.

It seems that if you're interested in working with existing conventions, including code, Software Engineering is the way to go (and that is really what most 'programming' jobs are), whereas if you're interested in solving new problems, Computer Science is the way to go. That's based on my understanding of the two up to this point, specifically at universities that offer programs in both.

And if this is indeed the case, it would make sense that 'plagiarism', or really any unsanctioned code reuse, might be a real problem in Computer Science education, whereas it would be a cornerstone of Software Engineering education.
 
It's because, as I said earlier, there's more than one way to skin a cat. If I don't patent the idea itself (the cat), meaning what an app does, its functionality, its "soul", and I just patent the code itself, someone could steal the idea and re-engineer the code. A different way to skin the cat.

So you patent the cat, and then it doesn't matter how you make the cat; it's still a cat. And I hold the patent on cats.

I understand your point, but it should be patenting the act of skinning the cat, not the method of skinning it. Just the act of skinning it, period. Not patenting the cat itself, because then you've patented that specific cat, and if my cat is different, it'll skate by.
 
This is like the idea that you don't need to understand math because you have calculators. There still needs to be a level of underlying knowledge about what you are doing. When I was in college, we always joked about the schools that only taught people how to basically copy and paste code. We were always told that we were learning how to learn how to program. Lots of classes used different programming languages; we even had a class that went over how all the different languages work, to understand them at a base level. How is HTML different from C++? How is Java different from C? If you only teach somebody how to copy and paste somebody else's code, they don't actually understand the basics, which also means they don't know how to create something new. That said, we did have a class where using any source (other than other students) was fine for getting information. The instructor's mindset was that once you hit the real world, your boss is never going to ask you a question and then tell you that you can't use any reference to answer it, so knowing how to look up what you don't know is as important as what you know. However, that class didn't require any coding for that part.

Only a quarter of the developers I've known could even come up with a unique way of doing things, regardless of their coding capability. Of course, they're way ahead of the general population, where it's probably about 3-5%. Most human beings are only capable of putting Legos together, not designing their own Lego pieces.

I can agree with this. Having majored in computer science, I saw that many people in my classes lacked enough true understanding to ever be anything more than a code monkey.
 
This is a really tricky subject. First off, a lot of people think that academics teach people how to think. Personally, I don't believe this to be true...
It's not. I'm in academics, at the college level, and we're well beyond the point of teaching people how to think; at this point they're pretty much hardwired already. We try to teach people the subject, and it's up to them to use their existing faculties to figure out what works for them in retaining knowledge of it, whether that's for a test, an assignment, or even just recalling where they saw the information so they can re-read it.

Actually coming up with truly novel, or even slightly novel, solutions is a very rare event. Yet we all try to fool ourselves into thinking we are doing our own work every day, when really our brains are just matching an existing problem to one we saw solved in the past. The best solutions are very often ones where someone worked out a solution and then it was improved upon many times until it just works very well, and those are probably the solutions you find in code dumps.
In academics you're right: you read about how to do something in a book or a lecture, and then you're expected to mimic what was taught in some way, in this case by writing code. But there is a hill of difference between me writing a piece of code that, say, creates 10 random numbers between 0 and 100 and then orders them from lowest to highest, and searching online to find some piece of code that already does it. In the former case it's a matter of me demonstrating that I can figure out how to do what was laid out in the assignment; in the latter case I'm saying "well, this is just something that needs to get done." In either case, though, I'm not going to come up with a "novel" solution, because coding has been out in the wild for a while and there are only so many ways to do a task like that, so you're bound to be doing what someone else did already. That in itself is not plagiarism.
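
For reference, the assignment described above comes out to just a few lines; a minimal Python sketch:

```python
import random

# Generate 10 random integers between 0 and 100, then order them
# from lowest to highest.
numbers = [random.randint(0, 100) for _ in range(10)]
numbers.sort()
print(numbers)
```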

The tricky part is how much plagiarism you allow before you realize the students aren't really capable of understanding the problem, or even properly solving it at all. That's really all academics are typically trying to vet out and avoid.
The problem is there seems to be a lot of confusion about what plagiarism is; even in this thread I see people trying to draw parallels with copyright law, which is pretty far off. Academics often want YOU to figure out how to do a particular task, and we realize that you will come to similar results as everyone else (who did the assignment correctly). Will you look at other code to see how it was done and then write something similar? Sure, why not; that's really no different from reading the book, looking at an example it works through, and trying to fold that into your solution in some way.

But the "copy/paste" mentality is where plagiarism shows up, 2 students who do a math assignment you expect them to get similar answers, and depending upon the complexity of work involved to even show similar steps. But if you see everything parsed out the same way, and written in similar locations, and the big giveaway is making similar mistakes, that's not you trying to mimic what was taught from you that's you copying someone's work and passing it off as your own, which is what plagiarism is. Doing a problem like Newton did isn't plagiarism, it's the way to do that problem, but just making a copy of what was done is plagiarism.
 
He's wrong with a caveat.

There are three levels of programmers:

1. Computer scientist (Computer Science degree)
2. Computer programmer (usually an arts degree or some fly-by-night school like ITT Tech)
3. Hacker (home-brewed and home-taught)

Understanding critical algorithms and how they work makes you a critical thinker and the best of the software engineers.

I see so many software programmers these days use rolled-up functionality like the STL or .NET's generic list/tree/dictionary classes, and then they wonder why their program, which randomly accesses large amounts of data, runs slower than crap. Ever try to block-access large amounts of linear memory in a managed language like Java/.NET? It's a freaking nightmare of inefficiency. While managed languages and generic templates are very elegant and powerful, if you don't know their strengths and weaknesses, you can still write crap.

There are, last I remember, at least 25 different sorting algorithms, each with its own strengths and weaknesses. (I've always liked heap sort personally, for the database trees I use, because it's easy to pack into a linear array.) Only someone who truly understands the theory behind each algorithm, and who has the insight to use it appropriately, is worthy of being an architect.
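
For anyone curious about the linear-array packing mentioned above: a binary heap stored in a flat array needs no pointers, because the children of the node at index i live at indices 2i+1 and 2i+2. A rough Python sketch of heap sort done that way (illustrative only, not the poster's code):

```python
def heapsort(a):
    """In-place heap sort over a plain list -- the heap lives in the array itself."""
    n = len(a)

    def sift_down(root, end):
        # Push a[root] down until the max-heap property holds up to index `end`.
        while 2 * root + 1 <= end:
            child = 2 * root + 1                       # left child
            if child + 1 <= end and a[child] < a[child + 1]:
                child += 1                             # right child is bigger
            if a[root] < a[child]:
                a[root], a[child] = a[child], a[root]
                root = child
            else:
                return

    # Build the max-heap from the bottom up.
    for start in range(n // 2 - 1, -1, -1):
        sift_down(start, n - 1)

    # Repeatedly swap the max to the back and shrink the heap.
    for end in range(n - 1, 0, -1):
        a[0], a[end] = a[end], a[0]
        sift_down(0, end - 1)

data = [42, 7, 19, 3, 88]
heapsort(data)
print(data)  # -> [3, 7, 19, 42, 88]
```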

Seeing someone else's solution doesn't mean you've mastered its strengths and weaknesses in your head. At best you'll only be a mediocre programmer. *Looks up on the shelf at his Knuth Fundamental Algorithms, volumes 1-2-3*
 
This is like the idea that you don't need to understand math because you have calculators. There still needs to be a level of underlying knowledge about what you are doing.


Agree somewhat, but teaching mathematics has other primary goals beyond solving for C. Studying math teaches and trains people how to think logically. It's a means to an end. The goal is much more than just being able to get the right number; it's about learning how to think in ways that allow people to solve problems. Not just math problems like A+B=C, but real-world, down-to-earth problems.

Our city has a landfill, and it's filling up... fast. We need to take action or we will have real problems soon. What do we do?

We can build a new landfill: how much will it cost, how long will it take, how big does it need to be, and how long will it last before we have to do this again?

And how much time do we have to come up with a solution? While we're at it, is a new landfill our best option, or do we have better choices?
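
To make that concrete, the core of it is back-of-the-envelope arithmetic; every figure in this Python sketch is invented purely for illustration:

```python
# Hypothetical numbers -- the point is the reasoning, not the data.
capacity_tons = 2_000_000        # total landfill capacity
current_fill_tons = 1_400_000    # already in the ground
intake_tons_per_year = 60_000    # yearly intake

years_left = (capacity_tons - current_fill_tons) / intake_tons_per_year
print(f"Roughly {years_left:.0f} years until full")  # -> Roughly 10 years
```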

Real-world problem solving: yes, you have to do some math, but all the math skills in the world are not enough if you can't think logically to begin with. Through math, people can learn how to think logically.

So I need to know how to think logically in order to do some real math. Now, I can use a calculator to do the arithmetic, but the calculator doesn't help with the logical thinking, so as a student one needs to show their work (y)
 
It's not. I'm in academics, at the college level, and we're well beyond the point of teaching people how to think; at this point they're pretty much hardwired already...

That's not what I would expect. This was a while ago, but my last duty position in the Army was as an instructor for my job specialty, so I was training the new kids. I routinely faced students who had problems doing what seemed to me to be the simplest tasks. Another instructor told me I felt this way because I had such a highly developed grasp of the work through my experience that these things just seemed basic to me. I didn't see it that way. It truly looked to me like most of my students had not learned how to think logically at a basic level. Now, some of you might imagine that a big part of this is, well... "look at what you had to deal with": people who signed up for the Army vs. university students, no comparison, right?

Across the board you might have a point, but when a soldier enters the service they take a battery of tests, and those tests are used to determine suitable jobs for the applicant. The soldiers selected for technical military intelligence positions had to score at the top levels. These kids weren't rocks; they had potential, but they didn't have any skills. I felt they were lacking basic logical thinking skills, the kind they should have mastered in high school solving problems about apples and triangles. It's so much easier to learn them in that context, because by the time they got to me it wasn't apples, it was surface-to-air missile launchers and their organic RADAR systems: all the funky names we give them, the characteristics of the RADAR systems, frequency ranges, how they deploy in the defense or in support of an attack. It becomes vastly more complex.

So this was my experience back around 1997. I am sure things are different to some degree; perhaps today students leave high school better prepared, and I may have been experiencing a low point in time. But to me, this just helps better define what we are saying here.
 
I can see both sides of the coin here. I think it boils down to differences in goals.

Most scientists/researchers/etc., in my experience, are concerned with finding the best way to do something, and then setting about implementing it to prove how well it actually works.

Most engineers/techs/operators, in my experience, are more concerned with getting something done in the first place, and then turning around and trying to find better ways to accomplish that.

There is merit to be had in re-inventing the wheel, but only so much. A smart person needs to understand both ways, and the difference between putting effort towards a new solution versus using an existing solution.
 
At the undergrad level, anyway, it's pretty much acceptable, even encouraged: just don't copy something word for word. I took lots of other people's reports, reworded them, submitted them as my own, and got A's for my effort. School grades are a terrific waste. All they do is show a potential employer whether you're willing to work your ass off for someone's approval. If someone wants to hire you, they should be able to test your knowledge THEMSELVES. There are simply way too many people out there with lofty degrees who are functional imbeciles, who just parroted back what professors wanted and then quickly forgot everything about the material in the class they took. This is why experience trumps academic grades every single time.
 