A medical doctor is expected to stay informed about best practices in treating her patients. If a doctor were to use a treatment from a hundred years ago, when there exists ample evidence that the treatment is inappropriate or even harmful, the doctor would be considered unprofessional. The doctor might even be sued for malpractice.
I teach in Atlanta, where teachers have been found guilty of cheating on their students' standardized tests. The teachers were tried by the Georgia Professional Standards Commission. They didn't commit a crime; they were fired for violating professional standards.
I've never heard of credentialing for higher education faculty, so there are probably no similar commissions judging a professor's professionalism. But that doesn't mean that we don't have responsibilities similar to those of other professions, like medicine. As ACM members, we're bound by a code of ethics, but I don't think it speaks to being an educator. Are there similar professional and legal responsibilities for us as computing educators, e.g., to stay informed about best practices, and to use those best practices in how we teach?
For example, we have a significant problem retaining students through the first course in computer science. Many introductory computer science courses are lecture-based. We've known for decades that lecture isn't a great method of teaching. There are better models that improve retention in CS classes, even when using lecture, like adopting more engaging curricula or using methods like pair-programming. Some of these changes might be too expensive to implement, but others, like peer instruction, have a strong research base supporting their benefits and can be adopted at no additional cost.
It seems crazy to suggest that someone might be unprofessional for simply lecturing in an introductory computer science course. But how is it different from the medical doctor example? Consider other areas of computing. If a software developer used antiquated methods that produced worse results than current ones, that developer would be asked to learn and use new methods, or be fired.
Retention is a desirable goal, but it's not a required goal. How about diversity? Computer science has an enormous diversity problem. If there were methods that we could use in our classes to improve our recruitment and retention of women and under-represented minorities (and I believe there are), are we unprofessional for not using them?
Stronger still, is there a legal requirement to use them? Title IX is a section of law in the United States that states: "No person in the United States shall, on the basis of sex, be excluded from participation in, be denied the benefits of, or be subjected to discrimination under any education program or activity receiving Federal financial assistance." Almost all higher education institutions in the United States receive some Federal financial assistance. Title IX has required many institutions to develop a stronger female athletics program (and provide scholarships), to balance the opportunities for males in sports like football and basketball.
Does Title IX require US institutions to measure gender diversity in our educational programs, and if we're not getting women involved, to change our programs? A computing degree has significant economic benefits. If our current teaching methods and curricula are not successfully recruiting and retaining women, are we "denying the benefits" of our programs to them?
I wonder how various other protections for students extend into the on-line world. FERPA protects information about students in US institutions. For example, US teachers are required to keep student grades confidential, which limits how we can implement peer grading. Students are not supposed to know other students' grades. When US institutions offer Massive Open On-line Courses (MOOCs), are they still limited by FERPA? What are the implications for peer grading in MOOCs?
I'm not a lawyer nor an expert in professional ethics. I'm a computing educator. I see new practices being invented in conferences like the ones sponsored by SIGCSE, but relatively few of those methods make it into classrooms broadly -- even the ones with significant research evidence that they are broadly effective. That's what's led me to wonder: If we know a better way to teach computing, are we professionally (and even legally) required to use it?
I'm someone who learns best from a lecture followed by practical use of the knowledge. Tell me about an algorithm, and then have me develop a program at a later time that implements it in an assignment or a lab. I really dislike interactive lectures (e.g., learning quicksort by sorting 7 students at the front of the class by height, filling out question sheets about the lecture during the lecture, etc.); two-way interaction breaks my focus on absorbing the material. I retain far less when I have to keep switching from taking in information to interacting and back again. I guess that's one of the problems with education for the masses: everyone learns differently, and you can't suit everyone's learning style.
As for the "diversity problem", no one has ever been able to satisfactorily explain to me why a disparity in genders/races is a bad thing. Is it really a problem if there are no people saying, "I'd like to do Comp Sci but I can't because of X" (where X is some problem faced by one gender/race and not the others)? I've never heard of sexist/racist policies among North American universities in recent times.
An interesting challenge is how to optimize learning -- doing what works best for most people, while meeting individual needs. The evidence that interactive learner-engagement in lectures works best for most people is VERY strong, e.g., Richard Hake has a 6,000-student study showing that interactive lectures are more effective for learning. Now, how do we help individuals who might learn differently than the norm?
I've tried to make the argument for diversity here before (e.g., http://cacm.acm.org/blogs/blog-cacm/149681-us-women-in-computing-why-isnt-it-getting-better/fulltext). One argument is that computing needs more *people*, and as long as we only draw from white/Asian males, we are limiting our labor pool. Another argument is for improved quality of design by drawing in a greater diversity of perspectives and experience. Of course, the disparity is a problem to women and under-represented minorities, who don't get access to the high-paying jobs in computing.
I do believe that there is research showing that there exist X factors that inhibit some people (especially women and under-represented minorities) from pursuing computing. I particularly recommend the work of Jane Margolis, Caroline Simard, and Betsy DiSalvo in this regard; they point out that bias does exist, even if unintentional.
Here's an example of the kind of unconscious bias that still exists in Universities: http://www.nytimes.com/2012/09/25/science/bias-persists-against-women-of-science-a-study-says.html