
Student and Faculty Attitudes and Beliefs About Computer Science

The curriculum should inspire students to view CS as both accomplishment and intellectual discipline.

What students think about a discipline—its structure, usefulness, how it is learned—plays an important role in shaping how they approach it. Just as faculty members aim to have their students learn the facts and skills of a discipline, they may also want to shape student beliefs and attitudes. Here, we report the attitudes of undergraduate computer science students early and late in the curriculum, comparing them with faculty attitudes in the same department. The results reflect the places where students think what faculty want them to think, where they do not think that way, and whether there is evidence that final-year students agree more or less with faculty than students in introductory courses. Together with earlier research, the results provide insight into sometimes surprising attitudes, and can help guide curricular improvement.

In physics education, research1,13 into key attitudes and beliefs about physics as a discipline and how they change suggests that courses sometimes shift student attitudes away from the attitudes endorsed by faculty. In particular, students may move toward the view that physics is mainly about memorizing and evaluating formulae, rather than about a conceptual understanding of the natural world.

CS faculty are likewise concerned with student attitudes: “CS is just programming;”17 “As long as a program works it doesn’t matter how it is written;” and “Theoretical CS is not relevant to the real world.” Do students hold such views? As they move through the curriculum, do their beliefs come to resemble those of the faculty teaching them?

Our study collected and analyzed data on these points. Specifically, we collected responses to 32 questions about attitudes and beliefs from beginning and advanced CS undergraduates and from faculty at the University of Colorado at Boulder. The results revealed some areas in which student responses clearly agree with faculty and others where they disagree. Comparing the responses of beginning students with those of more advanced students also suggests how progress through the curriculum changes student beliefs and whether such change is toward or away from what faculty believe.

We gathered 38 survey items from four sources: the Colorado Learning Attitudes about Science Survey for physics,1 selected on the grounds they were relevant (or relevant with small changes) to CS; an interview study of student attitudes toward group work in the CS curriculum16; faculty suggestions on student attitudes causing them concern; and Carol Dweck’s work7 on student attitudes about the roles of aptitude and effort in academic success.

We discussed the 38 items with a group of six volunteer students to determine whether they were easily understood. We discarded six items based on student input and changed the wording of others to make the intent of the survey statements clearer. The final survey consisted of 32 items; the table here includes the item numbers (non-consecutive) we used in coding the data and in analyzing the results.

Each of the 32 items asked faculty to indicate whether they strongly disagreed, disagreed, neither agreed nor disagreed, agreed, or strongly agreed. We gave students the same options but also asked them what they thought a CS professor would want them to say; for this response they were also allowed to indicate “don’t know.” The survey included a “catch” item (for both faculty and students) that instructed respondents to leave it blank, so that responses from participants who filled in answers without reading the items could be discarded.

In the final week of the spring 2006 semester, we administered the survey by email to faculty and in paper form to students in three CS courses: first-semester introductory (CS1), second-semester introductory (CS2), and senior-level capstone design.

Discussion

We obtained responses from 13 faculty (of a total of 25). From students, we received 71 surveys from CS1, 48 from CS2, and 41 from the senior capstone course. The survey was voluntary, though no more than one or two students in each class declined to participate. No surveys contained a response to the catch item, but we did reject one survey because the last three pages had identical responses for each item.

We tallied responses by grouping “strongly disagree” and “disagree” as negative responses, “strongly agree” and “agree” as positive responses, and all other responses, including omitted responses, as neutral. We examined the responses by faculty to classify the items as either rejected or endorsed by faculty. Using the criterion that 75% or more of faculty had to agree to reject or endorse an item, we excluded five items as not showing consensus among the faculty (see cluster 2 in the table).
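
To make the tallying and the 75% consensus criterion concrete, the following minimal Python sketch (illustrative only; the responses shown are invented, not the study’s data) collapses five-point answers into negative, neutral, and positive and classifies a single item by faculty consensus:

```python
from collections import Counter

# Responses are "SD", "D", "N", "A", "SA", or None (omitted).
def collapse(response):
    """Map a five-point response onto negative / neutral / positive."""
    if response in ("SD", "D"):
        return "negative"
    if response in ("SA", "A"):
        return "positive"
    return "neutral"  # includes "neither" and omitted responses

def faculty_consensus(faculty_responses, threshold=0.75):
    """Return 'rejected', 'endorsed', or None (no consensus) for one item."""
    counts = Counter(collapse(r) for r in faculty_responses)
    n = len(faculty_responses)
    if counts["negative"] / n >= threshold:
        return "rejected"
    if counts["positive"] / n >= threshold:
        return "endorsed"
    return None

# Invented example for one item: 11 of 13 faculty responses are negative,
# which exceeds the 75% criterion, so the item counts as rejected.
print(faculty_consensus(
    ["SD", "D", "D", "N", "D", "SD", "D", "D", "A", "D", "D", "SD", "D"]))
```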

We placed the remaining 27 items in thematic categories using a combination of what Adams et al.1 called “predeterminism” and “raw statistical” grouping. We first sorted them into groups reflecting our sense of the relationships among them, without reference to the data (predeterminism), and then used hierarchical cluster analysis, a statistical technique, to identify items participants commonly responded to in the same way (using the SPSS 16 package2).

Before we performed cluster analysis, we transformed responses for items in the same thematic category so answers reflecting a related underlying attitude would be coded the same. For example, we transformed the responses to item 64 (“If you can do something you don’t need to understand it”) so a negative response would match a positive response to item 12 (“I am not satisfied until I understand why something works the way it does”).

We used the results of the cluster analysis to modify the groupings to bring the resulting categories in line with the data, where appropriate. That is, where the data showed the participants commonly answered two items the same way, we grouped these items together, even if they were not grouped together in our original classification. In other cases, where the data showed that two items we thought were related were actually commonly answered differently, we adjusted the grouping to reflect that fact.
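
The grouping step can be sketched in the same spirit. The analysis above was done with hierarchical cluster analysis in SPSS 16; the sketch below is a hypothetical re-creation using SciPy, with randomly generated responses and made-up reverse-coded column indices, intended only to show the shape of the computation: reverse-code opposite-worded items, measure how similarly respondents answered each pair of items, then cluster the items.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# Invented data standing in for the survey: rows are respondents,
# columns are items, values are 1 (strongly disagree) to 5 (strongly agree).
rng = np.random.default_rng(0)
n_respondents, n_items = 160, 27
responses = rng.integers(1, 6, size=(n_respondents, n_items)).astype(float)

# Reverse-code items worded in the opposite direction (e.g., item 64 vs.
# item 12 in the text) so related attitudes are scored the same way.
reverse_coded = [3, 7, 12]          # hypothetical column indices
responses[:, reverse_coded] = 6 - responses[:, reverse_coded]

# Cluster items (columns) by how similarly respondents answered them,
# using correlation distance and average linkage.
item_distances = pdist(responses.T, metric="correlation")
tree = linkage(item_distances, method="average")

# Cut the tree into two top-level groups and print the membership.
groups = fcluster(tree, t=2, criterion="maxclust")
for item, group in enumerate(groups):
    print(f"item column {item}: cluster {group}")
```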

The table shows the resulting groupings of items. At the highest level, they fall into two broad clusters: “CS as an accomplishment” and “CS as an intellectual discipline.” Within them, “subclusters” are groups of items that were statistically related, while “themes” were items with related content not strongly related in the data. For example, items 8 and 36 both relate to aspects of group work, so they are thematically related, but participants often gave different responses to them. In cluster 2, the items in the subcluster were closely related in the data; the “related” items formed a larger cluster around them and were less related to the “other” items in the cluster.

We marked each item in the table for which faculty reached consensus to show whether faculty rejected or endorsed the item. The table also shows the percentage of CS1 students, CS2 students, and seniors (final-year students) who agreed with the faculty consensus. For most items, the percentage of CS1 and CS2 students endorsing the faculty position did not differ significantly; an asterisk next to the percentage for CS2 indicates the percentage is significantly different at the 5% level from the percentage for CS1 (according to a two-tailed Fisher’s exact test, a test for judging differences between groups when using small samples).
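
The significance comparisons described here can also be illustrated. The sketch below applies SciPy’s two-tailed Fisher’s exact test to a hypothetical 2×2 table of agree/not-agree counts for two student groups (the counts are invented, not the study’s data):

```python
from scipy.stats import fisher_exact

# Invented counts: how many students in each group agreed vs. did not
# agree with the faculty consensus on one item.
cs1_agree, cs1_total = 46, 71
senior_agree, senior_total = 36, 41

table = [
    [cs1_agree, cs1_total - cs1_agree],          # CS1: agree, not agree
    [senior_agree, senior_total - senior_agree], # seniors: agree, not agree
]

_, p_value = fisher_exact(table, alternative="two-sided")
print(f"two-tailed Fisher's exact test: p = {p_value:.3f}")
# A p-value below 0.05 would be marked with an asterisk in the table.
```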

A face symbol indicates whether the seniors’ responses were one of the following: generally in line with faculty responses (75% or more of seniors endorsing the faculty position), marked by a happy face; mixed (between 50% and 75% of seniors endorsing the faculty position), marked by a neutral face; or substantially in conflict (less than 50% of seniors endorsing the faculty position), marked by a sad face.

The last column in the table (seniors vs. CS1) shows how senior responses compared to the responses of CS1 students. The numbers are the difference between the percentage of seniors and the percentage of CS1 students agreeing with the faculty consensus on each item. A negative number means seniors agreed with faculty less than CS1 students. Asterisks mark differences in responses significant at the 5% level (Fisher’s exact test, two-tailed).

Now consider an example of how the table reflects these aspects of the data. In the first data line (item 10), the entry shows (reading from left) that the faculty consensus rejected this item; that 91% of CS1 students, 98% of CS2 students, and 95% of seniors also rejected the item; that the responses of the seniors were generally in line with the faculty consensus (the happy face means 75% or more agreement); and that the agreement between seniors and faculty (95%) was four percentage points greater than the agreement between CS1 students and faculty (91%).

Does greater agreement with faculty always represent goodness? No. We suggest later at least one item (4) on which we feel students were more right than faculty; readers might judge other items the same way. But faculty design the curriculum, and alignment with their attitudes is what they aim for. Failure to find alignment indicates a problem. Faculty weren’t of one mind, of course, and we removed items on which they didn’t agree, as described earlier.

In considering which items show evidence of progress or lack of progress as students move through the curriculum, we must urge caution in basing inferences on how students in the introductory classes compare with seniors; different students completed the survey in each of the classes. There were no doubt differences among the responses not traceable to having gone through more or less of the curriculum. Many surveyed CS1 students were not CS majors. A good many students join the CS program after CS1; of the senior class we surveyed, 45% joined after CS2 and 37% after CS1 and before CS2. Students also drop out of the program; about one-eighth of the surveyed CS1 students eventually ended up as CS majors. But the CS1 students reasonably represent the students available as input to the program. A finding that students late in the curriculum agree less with faculty could be due to many factors but likely shows that the output from the program is not improved with respect to the available input—a challenge to faculty charged with managing the program. The table does not present student responses about how faculty would like them to respond to each item. In some cases we bring this information into the discussion of individual items.

Among the 27 items for which we found strong faculty consensus, seniors were generally in line with faculty (75% or more agreement) on only seven items. For 16 items, we found mixed agreement (50%–75%), and for four items we found less than 50% of seniors endorsed the faculty position. Though this level of agreement was not good, it was better than for CS1 students. For 22 of the 27 items, we found seniors were more likely to agree with the faculty consensus than CS1 students. For seven of the items the difference in agreement was statistically significant. Among the five items on which seniors agreed less with the faculty consensus than CS1 students, the difference was statistically significant for one item (44).

Results by Subcluster and Theme

We now examine the data for the thematic groups of items and the subclusters to see what it says about student beliefs and how these beliefs differ between beginning and more advanced students.

Don’t learn just from examples. In this statistical cluster, seniors agreed with faculty that there is more to CS than memorizing solutions or learning a single best approach to problems. They agreed less strongly that reading is important, and they may even value reading less than CS1 students do.

The end justifies the means. Most seniors agreed with faculty that how a program is written is important (item 52) and were significantly more likely to share this attitude than CS1 students. This is good news. But only 68% of seniors agreed with faculty that doing things right is more important than just getting a solution, though this response represents a significant improvement over the position of CS1 students. Less than half of the seniors felt that how they do the work on an assignment is more important than getting “the desired answer” (item 20). This is a little better than the response from CS1 students, though not by much. Most seniors (76%) correctly indicated that faculty would disagree with the item, so this was not a case of students not knowing where faculty stood on an issue.

Why do students not feel that how they complete an assignment is important? This attitude may connect with a student’s wish to increase the difficulty of assignments as a way to demonstrate competence. Leonardi et al.8 found this tendency in interviews with CS students, writing:

“We found no place in the data where informants suggested that ignoring instructions would help them arrive at a better solution. Rather, they admitted readily that following pre-specified instructions made the task easier. But ignoring instructions increased the challenge and introduced a higher level of risk. As one engineer observed, ‘if you can do it by figuring it out yourself instead of following some cookie-cutter process then you’re on your way to becoming an expert.’ Success under such conditions demonstrated expertise and technical competence.”

Responses on this item might reflect what students saw as the role of assignments. According to some research,5 students may consider work on assignments to be a product offered for payment (in the form of a grade) rather than as a learning experience. Viewed this way, doing the work a certain way to learn new techniques is irrelevant.

Work in the real world. While most seniors recognized that computer scientists don’t spend a lot of time working alone, a third did not. Most rejected the cartoon stereotype of the incompetent manager, though many did not. While there is room for improvement, results on these items were better than for CS1 students. Most seniors rejected item 18 (“In the real world, computer scientists’ work is mostly programming”), a matter important to many people concerned about public perceptions of the field.17 The situation was significantly better among the seniors than among CS1 students; indeed, the situation among CS2 students was also significantly better than among CS1 students, though because a good many students entered the program at CS2 this improvement cannot be attributed to what happens in the curriculum. This item also showed the greatest difference between seniors and CS1 students on the survey. Still, just less than a quarter of the seniors thought the work of computer scientists is mostly programming.

Group work. While students generally recognized that much work in computing is collaborative, and departments and programs strive to provide opportunities for students to develop collaboration skills, the results we found for the two items in the “group work” theme showed there is room for progress. More than a quarter of the seniors indicated that one can take more pride in individual work, and almost half felt that understanding must be developed alone.

Waite et al.16 and Leonardi et al.8 discussed negative student attitudes toward group work, reporting that many students form the misconception that individual work is the essential measure of competence for skilled professionals. Faculty did not show consensus in rejecting item 36 (“It’s a lot more satisfying to do an assignment on your own than with help”) and item 38 (“When you submit a group project, the instructor can’t tell how good your individual work was”), suggesting that faculty, as well as students, need to do some work in this area.

Other items in cluster 1. Items 6, 14, 28, and 56 might have been expected to appear in one of the table’s subclusters (perhaps in “concepts and understanding”) but did not show response patterns closely related to the items in these clusters. As a group, they showed only middling agreement between seniors and faculty, with agreement by seniors being somewhat greater than by CS1 students. For item 28 (“A significant problem in learning computer science is being able to memorize all the information I need to know”) and item 56 (“In the computer science curriculum, if you spend enough time coding you’ll get all you need to get from the courses”), seniors agreed with faculty significantly more than CS1 students agreed with faculty.

Less than half of the seniors rejected item 66 (“If you know what you are doing you can leave work to the last minute and still get it done”), confirming a problem identified by Waite et al.16 and Leonardi et al.8: students see procrastination not as a failing but as something positive, a way to demonstrate personal prowess. Leonardi et al. wrote:

“The important point here is that for the informants in this study, waiting until the last minute to begin a project was not a sign of laziness or disinterest in the subject matter. Rather, beginning an assignment late makes successfully completing the task more difficult, and, thus, is a sign of their expertise and mastery of technical skill. In the laboratories on days before large projects were due, informants regularly discussed the status of their projects with one another, comparing how much they had completed. Similarly, on days on which a large project was due, student engineers typically asked one another ‘When did you start?’ and ‘When did you finish?’ Higher status was awarded to those who could wait the longest and still complete the project successfully.”

Senior attitudes on this matter were hardly better than those of CS1 students and slightly worse than those of CS2 students; at least they weren’t much worse.

CS is creative and valuable. Responses to the three items in this sub-cluster were strongly related. Unfortunately, agreement between seniors and faculty was not strong for any of them. Worse, for two items, 50 (“The work you do in computer science in the real world requires a lot of creativity”) and 60 (“Reasoning skills used to understand computer science material can be helpful to me in understanding things in everyday life”), seniors agreed less with faculty than did CS1 students. For item 58 (“Research in computer science often develops really important ideas”), agreement with faculty was somewhat stronger among seniors than among CS1 students but at 66% did not constitute a ringing endorsement.

Related items: Concepts and understanding matter. As mentioned earlier, these items form a larger subcluster together with the subcluster just discussed. There was variation in the number of seniors endorsing the faculty consensus, pointing to the value of concepts and understanding. For two of the apparent bright spots, item 12 (“I am not satisfied until I understand why something works the way it does”) and item 64 (“If you can do something you don’t need to understand it”), agreement was good but hardly better among the seniors than among CS1 students. Only for item 46 (“If I get stuck on a computer science problem, there is no chance I’ll figure it out on my own”) was the greater agreement by seniors than by CS1 students statistically significant.

On the negative side, for two of the items in this group the seniors agreed less with faculty than did the CS1 students. For example, seniors were less likely to endorse item 44 (“When I solve a computer science problem, I explicitly think about which computer science ideas apply to the problem”) than were the CS1 students. Most seniors (88%) said faculty explicitly endorsed thinking about ideas, but they themselves didn’t endorse it. Why didn’t they?

Interviews reported by Leonardi et al.8 may shed light on this misalignment; the authors identified a “norm” among students aspiring to be engineers, that “expertise is measured by task difficulty,” writing:

“The norm suggests that engineers should place value on overcoming challenge and ‘beating the odds.’ The work practices reflecting this norm artificially and purposefully increased the difficulty of a given task, such as a homework assignment. Taken together, these practices introduced a sense of ‘sport’ to engineering work by providing handicaps that ultimately decreased an informant’s chances of success. Informants perceived that completing a task with a handicap was a mark of an ‘expert engineer.’”

Leonardi et al. also suggested that one way students increase the difficulty of assignments (so as to demonstrate their skill to themselves and sometimes to their peers) is to ignore concepts that would actually help with the work.

Other items in cluster 2. Like some of the “other” items in cluster 1, items 26 and 48 in this group might have been expected to appear in one of the subclusters but did not. They show only middling agreement between seniors and faculty, with agreement by seniors greater than by CS1 students. For item 48 (“There are times I solve a computer science problem more than one way to help my understanding”), seniors agreed with faculty significantly more than CS1 students agreed with faculty.

Item 4 (“Nearly everyone is capable of succeeding in the computer science curriculum if they work at it”) reflects an interesting situation. Faculty consensus rejects Dweck’s view7 that effort is the key to success, but most seniors do not reject this attitude, and only slightly more seniors than CS1 students side with the faculty position. For someone agreeing with Dweck, it’s good that student views on the value of effort aren’t changed much. It’s also interesting that seniors wrongly believed faculty endorse Dweck’s position, with 88% of seniors indicating that faculty would want them to agree with the item.

The data further suggests that item 4 was not very strongly related to any of the other items in the survey. Despite falling in cluster 2 in the hierarchical-clustering results, it is the item in that cluster that is least closely related to the other items.

Top-level clusters. The hierarchical cluster analysis revealed two clear categories in the data, and a review of the items in each cluster showed them to be meaningful groupings. The groups suggest that students conceptualize CS in two distinct ways: The first is “CS as accomplishment,” in which the emphasis is on outcomes and what it takes to reach them, including skill, technical expertise, programming knowledge, and resources (books, peers, teachers). The second is “CS as intellectual discipline,” in which the emphasis is on how CS offers a way to approach and understand the world, including how to reason, gain understanding and deep learning, appreciate the importance of creativity, and dwell on problems to be able to explore them fully. This intellectual-discipline view is very much the perspective on the field emphasized by Wing17 in her work on computational thinking.

The fact that these two clusters emerged from the data is important. Interestingly, earlier research discussed a similar contrast between accomplishment and creativity in engineering.3,14,10 It is possible that the two perspectives—CS as accomplishment and CS as intellectual discipline—could be in tension with one another. How might they be reconciled or otherwise aligned?

We can revisit some of the data reviewed earlier and consider how it reflects on these perspectives. Seniors were in conflict with faculty on two items in cluster 1, and the responses from CS1 and CS2 students were similar. First, seniors believed that waiting until the last minute is acceptable if you have the know-how (item 66). Second, they believed that getting the desired result is more important than how you get there (item 20). These results directly confirm the findings of earlier research,8,15,16 highlighting the emphasis on accomplishment at the expense of other considerations that might be important to faculty or to effective learning.

In cluster 2 there was conflict with faculty on item 44 (“When I solve a computer science problem, I explicitly think about which computer science ideas apply to the problem”). Faculty and CS1 students agreed that they intentionally reflect on which CS ideas apply to the problem they are trying to solve. Less than half of seniors claimed to do so. This, too, supports the view of CS as competence, where skill is the application of knowledge, rather than a type of reasoning or discipline. This item (44) was the one with the greatest difference between CS1 students and seniors, in the negative direction.

The only other conflicting item is potentially troubling if we are concerned with access to the CS major. Faculty did not endorse the statement that anyone could succeed at CS if they worked at it (item 4); students in all groups consistently disagreed with faculty on this.

Looking at differences between seniors and CS1 students in their agreement with faculty, all items for which seniors agreed less with faculty than CS1 students did were in cluster 2, with the exception of item 54, on the importance of reading. Compared to CS1 students, fewer seniors believed “real-world” CS requires creativity (item 50); fewer believed that either the reasoning skills of CS (item 60) or its theoretical concepts (item 22) were relevant to everyday life; and, as discussed, fewer were intentionally reflective when solving CS problems (item 44).

Overall, the average agreement between seniors and faculty was 67% for cluster 1 and 63% for cluster 2, not very different. But the average increase in agreement with faculty, comparing seniors with CS1 students, was 16 percentage points for cluster 1 and only one percentage point for cluster 2. The results suggest the curriculum falls significantly short in helping students develop the perspective that CS is an intellectual discipline.

Conclusion

The survey results and analysis suggest a number of challenges to the curriculum in which the students and faculty participated. Using one item (50) as an illustration, faculty must consider ways to move students toward the idea that “The work you do in computer science in the real world requires a lot of creativity,” rather than away from it. A next step could be collecting longitudinal data from the same students as they move through the curriculum. Collecting data in key courses at the beginning and end of a semester would also be useful in separating the effects of selection from the effects of courses themselves and in zeroing in on the effectiveness of courses intended to promote particular attitudes and beliefs.

Besides being important learning targets in themselves, the attitudes and beliefs explored here may also be important in other ways. Studies in university physics education show that student attitudes and beliefs relate to performance on content assessments.1,13 Studies in physics education also show direct evidence of selection, rather than a change in attitude, as more advanced students are compared with beginners. This selection effect raises the possibility that understanding student attitudes and beliefs could be important in terms of retention and understanding why some groups are less well represented than others in CS programs. Although we did not collect data on gender, it is possible that attitudes and trends differ for male and female students, and that understanding them could help address underrepresentation of women in CS.

Development of attitudes and beliefs as learning goals is a key part of the process by which college education socializes students into their professions.4,11,12,18 Waite et al.15,16 presented curricular and pedagogical innovations pointing in the right direction on a number of issues (such as increasing student involvement during class and creating team-based assignments that require genuine collaboration rather than a “hands off” approach). For example, reducing the weight placed by faculty on assignment grades and encouraging collaboration can improve student attitudes toward assignments.15 Such innovation could make a big difference if adopted and reinforced throughout a curriculum. What kind of force is needed to make it happen?

“Curiosity” is one possible answer. If you are a CS faculty member, how would your students respond to the survey? Its results came from just one department, likely not yours. You might think your students would never deliberately make their work more difficult or that they are all aware of the value of CS research. But are you sure?

Curiosity is very important in fueling improvements in physics education, as faculty found their students did not respond as they would have wished on evaluation instruments shared across institutions (see, for example, Crouch and Mazur6). Data on this problem provided the foundation for efforts that produced measurable improvements. Can CS faculty achieve the same results?

Acknowledgements

An earlier report of these results appeared in 2007 in the ACM SIGCSE Bulletin.9 We thank Amer Diwan, Christian Doerr, Noah Finkelstein, Gary Nutt, Steven Pollock, Bruce Sanders, and Brian Shuckey for ideas, inspiration, and assistance. This project was an activity of the President’s Teaching and Learning Collaborative (http://www.colorado.edu/ptsp/ptlc/) of the University of Colorado.

Tables

Table. Survey results

References

    1. Adams, W.K., Perkins, K.K., Podolefsky, N., Dubson, M., Finkelstein, N.D., and Wieman, C.E. New instrument for measuring student beliefs about physics and learning physics: The Colorado Learning Attitudes about Science Survey. Physical Review Special Topics: Physics Education Research 2, 1 (2006).

    2. Anderberg, M.R. Cluster Analysis for Applications. Academic Press, New York, 1973.

    3. Bucciarelli, L.L. Designing and learning: A disjunction in contexts. Design Studies 24, 3 (May 2003), 295–311.

    4. Bucciarelli, L.L. and Kuhn, S. Engineering education and engineering practice: Improving the fit. In Between Craft and Science: Technical Work in U.S. Settings, S.R. Barley and J.E. Orr, Eds. ILR Press, Ithaca, NY, 1997, 210–229.

    5. Button, G. and Sharrock, W. Project work: The organisation of collaborative design and development in software engineering. Computer Supported Cooperative Work 5, 4 (Dec. 1996), 369–386.

    6. Crouch, C.H. and Mazur, E. Peer instruction: 10 years of experience and results. American Journal of Physics 69, 9 (Sept. 2001), 970–977.

    7. Dweck, C.S. Self-Theories: Their Role in Motivation, Personality and Development. Psychology Press, Philadelphia, PA, 2000.

    8. Leonardi, P.M., Jackson, M.H., and Diwan, A. The enactment-externalization dialectic: Rationalization and the persistence of counterproductive practices in student engineering. Academy of Management Journal 52, 2 (Apr. 2009), 400–420.

    9. Lewis, C. Attitudes and beliefs about computer science among students and faculty. SIGCSE Bulletin 39, 2 (June 2007), 37–41.

    10. Margolis, J. and Fisher, A. Geek mythology. Bulletin of Science, Technology, and Society 23, 1 (Feb. 2003), 17–20.

    11. National Academy of Engineering. The Engineer of 2020: Visions of Engineering in the New Century. National Academies Press, Washington, D.C., 2004.

    12. Ondrack, D.A. Socialization in professional schools: A comparative study. Administrative Science Quarterly 20, 1 (Mar. 1975), 97–103.

    13. Perkins, K.K., Adams, W.K., Finkelstein, N.D., Pollock, S.J., and Wieman, C.E. Correlating student beliefs with student learning using the Colorado Learning Attitudes About Science Survey. In Proceedings of the Physics Education Research Conference (Sacramento, CA, Aug. 5). AIP, Melville, NY, 2004, 61–64.

    14. Vincenti, W.G. What Engineers Know and How They Know It: Analytical Studies from Aeronautical History. Johns Hopkins University Press, Baltimore, MD, 1990.

    15. Waite, W.M., Jarrahian, A., Jackson, M.H., and Diwan, A. Design and implementation of a modern compiler course. In Proceedings of the 11th Annual SIGCSE Conference on Innovation and Technology in Computer Science Education (Bologna, Italy, June 26–28). ACM Press, New York, 2006, 18–22.

    16. Waite, W.M., Jackson, M.H., Diwan, A., and Leonardi, P.M. Student culture vs. group work in computer science. SIGCSE Bulletin 36, 1 (Mar. 2004), 12–16.

    17. Wing, J. Computational thinking. Commun. ACM 49, 3 (Mar. 2006), 33–35.

    18. Yurtseven, H.O. How does the image of engineering affect student recruitment and retention? A perspective from the USA. Global Journal of Engineering Education 6, 1 (2002), 17–23.

    DOI: http://doi.acm.org/10.1145/1735223.1735244
