#24: 6/11/22 Program #5 Graded |
The TAs/Readers have run the rubric that I gave them for Program #5 (observing
the behavior of your simulations, and looking at your code for inheritance)
and the grades are now recorded.
See the
assignment grades and
Grades(zipped .xlsm file) files, whose
details are discussed below, in Announcement #7.
The class average was about 87% and the median was 92%, meaning that most
students correctly solved most problems, and over half (65%) of the class
correctly solved all the problems (or had minor deductions).
Overall there were 65% As, 17% Bs, 6% Cs, and 9% Ds and Fs.
About 43% of the students submitted early, and these early submitters scored much better (average 95%) than students submitting on the due day (average 80%); I am assuming that some students ran out of time before they finished all the parts.

In the assignment spreadsheet, Column A contains the Hashed IDs of all students (in sorted order); Column B contains an X if we believe the student submitted work on time; Column C shows who graded your program; Column D shows the extra credit points for early submissions. Row 2 shows the worth (in number of points) for each part of the problem. Rows 4-5 show further information about the tests performed in each column. Rows 6 and beyond show credit awarded for each student: a blank cell means full credit; X means no credit; and P means partial credit, which means substantially correct but missing something important. Each of these marks should have a comment attached to it with the TA's brief description of the problem. Column S shows each student's cumulative score, for all the parts in the single problem in this assignment. Columns T-V show each student's cumulative Score, the score Rounded to an integer (what integer is entered in the Grades spreadsheet), and Percent, based on the number of points the assignment is worth (here 50).

Note that the following instructions were in the assignment: When you define these classes, you should not modify any code written in their base classes. You should also not access by name or duplicate any instance variables (attributes) that are inherited. You can call methods in the base classes to access/update their instance variables and override any methods in the base classes (whose bodies might call the overridden methods to help them accomplish their task). You can also define new methods. Each leaf class must define or inherit an update and display method, which respectively implement the behavior of objects in that class and the image that they display on the canvas.
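The inheritance rules quoted above can be illustrated with a small sketch. All names here (Prey, Hunter, the methods) are hypothetical, not the assignment's actual classes; the point is only the mechanics: derived classes never touch inherited attributes by name, they call base-class methods, and an override may call the overridden method to do part of its work.

```python
# Hypothetical sketch of the stated rules: Hunter never reads/writes the
# base class's _x/_y attributes directly; it uses base-class methods and
# overrides update (calling the overridden version via super()).

class Prey:
    def __init__(self, x, y):
        self._x = x          # base-class state: derived classes must not
        self._y = y          #   access these attributes by name

    def position(self):      # accessor a derived class may call
        return (self._x, self._y)

    def move(self, dx, dy):  # mutator a derived class may call
        self._x += dx
        self._y += dy

    def update(self):
        self.move(1, 0)

class Hunter(Prey):
    def update(self):
        super().update()     # call the overridden method to help do the work
        self.move(0, 1)      # then extend its behavior via base-class methods

    def display(self):       # leaf classes define/inherit update and display
        return f'Hunter at {self.position()}'

h = Hunter(0, 0)
h.update()
print(h.display())           # Hunter at (1, 1)
```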
For the last three columns, my grading instructions to the TAs were as follows. Although subjective, the averages for all the programs the TAs graded were within a few points of each other, so the grading was consistent.
This assignment was designed to illustrate the mechanics of Python inheritance, and its use to minimize code (including using multiple inheritance, once). It also illustrated the Model-View-Controller (MVC) way of writing Graphical User Interface (GUI) applications, with the student supplying code related to the Model only, and me supplying code for the View and Controller for its animation. |
#23: 6/5/22 Quiz #8 Graded |
The Readers have graded the code and document submissions for Quiz #8.
The grades are now recorded.
Recall that this quiz was only 15 points, not the normal 25.
See the
assignment grades and
Grades(zipped .xlsm file) files, whose
details are discussed below, in Announcement #7.
The class average was about 85% and the median was 87%, meaning that most students correctly solved most problems, and nearly half (47%) of the class correctly solved all the problems (or had minor deductions). Overall there were 47% As, 26% Bs, 10% Cs, and 17% Ds and Fs.

In the assignment spreadsheet, Column A contains the Hashed IDs of all students (in sorted order) and Column B contains an X if we believe the student submitted work on time; Column C shows the extra credit points for early submissions. Row 1 for Columns D-G shows how many points the problems were worth. Rows 4 and beyond show the number of points earned by each student. Columns H-J show each student's cumulative Score, the score Rounded to an integer (what integer is entered in the Grades spreadsheet), and Percent, based on the number of points the assignment is worth (here 15).

The Readers graded the following problems. 1a: Hyuho Oh (half: average 3.71 pts); 1a: Aaron Winter (half: average 3.68 pts); 1b: Jakub Antosz (all); 2a: Valeria Maya (half); 2b: Elizabeth Wing Yun Lui (all). The spreadsheets have some comments explaining point deductions: read the comments in the cells. Email the appropriate Reader about any grading issues. Here is a quick overview of the general rubric (see my answers online).
This assignment was designed to provide you with an opportunity to use both performance tools on a small scale, in the context of problems related to analyzing algorithms and code. It is easy to scale up and measure arbitrarily complicated code. As with all assignments, you should examine my solutions. |
#22: 5/29/22 Quiz #7 Graded |
The Readers have graded the Gradescope submissions for Quiz #7.
The grades are now recorded.
See the
assignment grades and
Grades(zipped .xlsm file) files, whose
details are discussed below, in Announcement #7.
The class average was about 75% and the median was 80%, meaning that some
students correctly solved most problems, but less than half (28%) of the class
correctly solved all the problems (or had minor deductions).
Overall there were 28% As, 28% Bs, 11% Cs, and 33% Ds and Fs.
In the assignment spreadsheet, Column A contains the Hashed IDs of all students (in sorted order); Column B contains an X if we believe the student submitted work on time; Column C shows the extra credit points for early submissions. Row 1 in Columns D-I shows how many points the problems were worth. Rows 3 and beyond show the number of points earned by each student. Columns J-L show each student's cumulative Score, the score Rounded to an integer (what integer is entered in the Grades spreadsheet), and Percent, based on the number of points the assignment is worth (here 25).

The TAs/Readers graded the following problems. Problem 1: Rahima; Problem 2: Elizabeth; Problem 3: Kyuho; Problem 4: Aaron (left problem) and Valeria (right problem); Problem 5: Jakub; Problem 6: Haining. You can see your score and the rubrics on Gradescope. I will be doing all the regrades. If you think the rubric was applied incorrectly (see my solution too), you can ask for a regrade on Gradescope. But, before you ask for a regrade, contact the TA/Reader who graded that problem first, to determine whether it was graded correctly. For Quiz #6 a few students didn't contact the TA/Reader and lost points for not following the correct protocol. If you have a question about the rubric, ask it on Ed Discussion: you may do so anonymously.

Students might have a hard time understanding their grade on part 2a: I wanted to see the calculation set up as it was in the notes, and simplification occur (cancellation) to get to the answer; using big-O notation was not appropriate in the calculation. Likewise, for part 3c many students wrote about N as if it were the argument to the function, instead of N being the size of the argument: to get full credit students needed to correctly describe the argument. Students should examine their graded work immediately and get any regrade issues settled as soon as possible (within a week of when the grade is assigned).
This assignment was designed to provide you with an introduction to solving problems related to complexity classes and analyzing algorithms and code. All these topics may be tested again on the Final exam (and frequently come up in job interviews, exploring a student's depth of understanding about the non-coding aspects of programming). As with all assignments, you should examine my solutions.
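As a concrete example of both analysis points raised for parts 2a and 3c (illustrative only, not a quiz problem): here N is the size of the argument alist, not the argument itself, and the exact count N*(N-1)/2, after simplification/cancellation, is what puts the function in O(N**2). Empirically, doubling N should roughly quadruple the running time:

```python
# Illustrative only: for this function, N is the SIZE of the argument alist.
# It makes N*(N-1)/2 comparisons in the worst case, which simplifies
# (cancellation of lower-order terms/constants) to O(N**2).
import timeit

def has_duplicates(alist):
    return any(alist[i] == alist[j]
               for i in range(len(alist))
               for j in range(i+1, len(alist)))

for n in (250, 500, 1_000):
    # lists with no duplicates force the worst case: every pair is checked
    t = timeit.timeit(lambda: has_duplicates(list(range(n))), number=1)
    print(f'N = {n:>5}: {t:.4f} sec')
# The printed times should roughly quadruple each time N doubles.
```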
|
#21: 5/26/22 In-Lab Programming Exam #3 Graded |
I have run the automatic batch self-check tests for In-Lab Exam #3 and the
grades are now recorded.
See the
assignment grades and
Grades(zipped .xlsm file) files, whose
details are discussed below, in Announcement #7.
The actual file I used for grading is
bsc.
You can find your solutions (by your ID Hashed), my solution, and the
bscile3S22.txt file that I used to compute grades for this
assignment in Ed Resources for this class
(see the name ile3materials.zip).
About 2 dozen students had timeouts. So scores for these students do not appear in the spreadsheets. If you are one of these students, talk to your TA about regrading. I computed your scores in two ways, recording the larger of the two numbers in column V in the spreadsheet.
The raw class average was 56% and the median was 52%. At the extreme, 11% of the students scored 100% or more (because of the extra credit point; all required methods passed all batch self-check tests) and 53% scored less than 50% (solved no problems fully correctly). Because the raw average was below 75%, there are about 9.7 normalization points for this testing instrument (raising averages by about 19%).

The approximate distribution of grades on this In-Lab exam (after normalization) is 47% As, 0% Bs, 3% Cs, 7% Ds, and 42% Fs; last quarter the scores were 43% As, 20% Bs, 10% Cs, 0% Ds, and 22% Fs; Fall quarter they were 49% As, 9% Bs, 2% Cs, 2% Ds, and 38% Fs. In terms of the problems: about 20% solved problem #1 fully, about 25% solved problem #2 fully, and about 32% solved problem #3 fully. As another comparison, this quarter about 59% solved no problems fully correctly, about 26% solved one problem fully correctly, about 10% solved two problems fully correctly, and about 11% solved all three problems fully correctly. This U-shaped normalized distribution (89% As and Fs) is common for In-Lab Programming Exams, where we are testing competency/mastery of programming concepts: the ones who attained it scored As (could do most everything in the allotted time); the ones who have not attained it scored Fs (solved no problems in the allotted time); only about 10% of the students (which was actually lower this quarter than normal) scored somewhere in-between.

FYI, the normalized averages for the different lab times were: 74% for students in Labs 1, 2, and 3 (meeting at 9am), 74% for students in Labs 4, 5, and 6 (meeting at 11am), 77% for students in Labs 7, 8, and 9 (meeting at 1pm), and 75% for students in Lab 10 (meeting at 3pm).
Lab 1 (at 9am) had the highest average (85%) and Lab 3 (also at 9am) had the lowest average (61%). Students should talk to the TA for their Lab first, if they do not understand why they received the marks they did or dispute any of these marks. The best time to talk with your TA/Reader about grades is during one of your Labs, when both student and TA/Reader are physically present to examine the submission and the grade, possibly running the solution on a computer they can share. Students should examine their graded work immediately and get any regrade issues settled as soon as possible (within a week of when the grade is assigned). IMPORTANT Information about Student Grades
ANALYSIS: Certainly it was a hard exam (I don't think any of my exams are easy), but 47% scored an A after normalization. I felt that the problems that I asked were similar to those on Quizzes 4, 5, and 6 and on last quarter's In-Lab Exam #3. Of course, similarity is in the eye of the beholder: generally "better" programmers will find more similarities. Of course, memorizing the solution code from the quizzes will do little good; understanding the code is the goal that enables writing similar code. So, in total, students completely solved about 26% (47% last quarter) of all the problems on the exam. Unlike the previous two exams, it was harder to hack your way to a solution for these problems: you had to understand better the problem and what you were doing.

If we look at the current course grades, there are 53% As, 20% Bs, 13% Cs, 4% Ds, and 10% Fs (although some of those students have dropped the course). Last quarter the grades at this point were 57% As, 17% Bs, 8% Cs, 6% Ds, and 12% Fs; Fall quarter they were 43% As, 27% Bs, 11% Cs, 4% Ds, and 15% Fs. The percentages of As and Bs were 73%, 74%, and 70%: so similar. Finally, last quarter the final grades were 47% As, 22% Bs, 11% Cs, 5% Ds, and 15% Fs. So, expect a similar down-shifting after the 200 point final exam. If you want to discuss general grading issues (not how your specific submission was graded), I suggest posting on Ed Discussion. It is OK to post anonymously to your classmates, for grading issues. |
#20: 5/26/22 Program #4 Graded |
I have run the automatic batch self-check tests for Program #4 and the grades
are now recorded.
See the
assignment grades and
Grades(zipped .xlsm file) files, whose
details are discussed below, in Announcement #7.
The class average was about 100% and the median was 102%, meaning that most
students correctly solved most problems, and well over half (95%) of the class
correctly solved all the problems (or had minor deductions).
Note that this problem had an extra credit part, as well as extra credit for
an early submission.
Overall there were 95% As, 3% Bs, 1% Cs, and 1% Ds and Fs.
About 41% of the students submitted early, and these early submitters scored
slightly
better (average 104%) than students submitting on the due day
(average 97%); I am assuming that some students ran out of time before they
finished all the problems, and will plan to get started earlier on later
programs.
In the assignment spreadsheet, Column A contains the Hashed IDs of all students (in sorted order); Column B contains an X if we believe the student submitted work on time (for pairs, only the submitting student will show an X, not their partner); Column C shows the extra credit points for early submissions. Row 2 shows the worth (in number of points) for each problem. Row 3 shows the number of tests performed for each problem: all were batch self-check tests. Rows 4-5 show further information about the tests performed in each column. Rows 6 and beyond show the number of failed tests for each student (a blank indicates no failed tests: equivalent to 0 failed tests). To compute the number of points for a problem/in a column, compute the percentage of successful tests and multiply it by the number of points the problem is worth. So for example, if a student missed 1 of 4 tests on a 4-point problem, he/she would receive 3/4*4 = 3 points.

Column P shows each student's cumulative score, for all the tests in the single problem in this assignment. Columns Q-S show each student's cumulative Score, the score Rounded to an integer (what integer is entered in the Grades spreadsheet), and Percent, based on the number of points the assignment is worth (here 50). Note that these columns are filled in both for submitters and their partners (these are the only columns filled in for partners): a partner should see his/her submitter's line for details. To get the extra credit point for processing string annotations (Column O), your code must pass all 6 tests.

Students should talk to the TA for their Lab first, if they do not understand why they received the marks they did or dispute any of these marks. The best time to talk with your TA about grades is during one of your Labs, when both student and TA are physically present to examine the submission and the grade, possibly running the solution on a computer they can share.
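The per-problem scoring rule described above is a one-line computation (the helper name below is mine, not the spreadsheet's): score = (tests passed / tests run) * points the problem is worth.

```python
# Hypothetical helper mirroring the stated rule: percentage of successful
# tests, multiplied by the points the problem is worth.
def problem_score(failed, total_tests, worth):
    return (total_tests - failed) / total_tests * worth

print(problem_score(1, 4, 4))    # missed 1 of 4 tests on a 4-point problem -> 3.0
print(problem_score(5, 20, 4))   # missed 5 of 20 tests on a 4-point problem -> 3.0
```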
Students should examine their graded work immediately and get any regrade issues settled as soon as possible (within a week of when the grade is assigned). IMPORTANT Information about Student Grades
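Program #4's central technique, described at the end of this announcement (a class overloading __call__ so its instances act as a function decorator that checks parameter and return annotations), can be sketched minimally. Everything below is illustrative, not the assignment's actual code: real annotations can be far richer than plain types (the assignment used mutual recursion to handle nested ones), and this sketch checks only simple isinstance cases.

```python
# Minimal, hypothetical sketch: a class-based decorator whose __call__
# checks simple type annotations on parameters and on the result.
import inspect

class Check_Annotation:
    def __init__(self, f):
        self._f = f
        self._sig = inspect.signature(f)

    def __call__(self, *args, **kargs):
        bound = self._sig.bind(*args, **kargs)     # map arguments to parameter names
        for name, value in bound.arguments.items():
            annot = self._sig.parameters[name].annotation
            if annot is not inspect.Parameter.empty and not isinstance(value, annot):
                raise AssertionError(f'{name} = {value!r} fails annotation {annot}')
        result = self._f(*args, **kargs)           # call the decorated function
        rannot = self._sig.return_annotation
        if rannot is not inspect.Signature.empty and not isinstance(result, rannot):
            raise AssertionError(f'result = {result!r} fails annotation {rannot}')
        return result

@Check_Annotation
def add(x: int, y: int) -> int:
    return x + y

print(add(1, 2))      # 3
# add(1, 'a') would raise AssertionError: y = 'a' fails annotation <class 'int'>
```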
This assignment was designed to illustrate the mutual recursion used for checking annotations (parameter and return) for functions, by overloading the __call__ method in a class, creating a function decorator. It provided some introspection code (examining function headers) that you needed to use to write your decorator. As with all assignments, you should examine my solution. |
#19: 5/23/22 Quiz #6 Graded |
I have run the automatic batch self-check tests for Quiz #6 and the TAs/Readers
have graded the paper submissions.
I used the following batch self-check files
(similar to the ones I provided, but with some altered/additional tests).
The grades are now recorded.
See the
assignment grades and
Grades(zipped .xlsm file) files, whose
details are discussed below, in Announcement #7.
The class average was about 81% and the median was 86%, meaning that most
students correctly solved most problems, and almost half (41%) of the class
correctly solved all the problems (or had minor deductions).
Overall, there were 41% As, 27% Bs, 9% Cs, and 23% Ds and Fs.
About 47% of the students submitted early (to Checkmate), and these early
submitters scored much better (average of 91%) than students submitting on
the due day (average of 72%); I am assuming that some students ran out of
time before they finished all the problems, and will plan to get started
earlier on later quizzes.
In the assignment spreadsheet, Column A contains the Hashed IDs of all students (in sorted order) and Column B contains an X if we believe the student submitted work on time; Column C shows the extra credit points for early submissions: in some cases I "fixed" student programs by finding/removing bad imports (a 2 point deduction), so this column will show a negative number. Row 1 for Columns D-K shows how many points the problems were worth. Row 2 shows the number of tests performed for each problem. Rows 4 and beyond show the number of failed tests for each student (a blank indicates no failed tests: equivalent to 0 failed tests). To compute the number of points for a problem/in a column, compute the percentage of successful tests and multiply it by the number of points the problem is worth. So for example, if a student missed 5 of 20 tests on a 4-point problem, he/she would receive 15/20*4 = 3 points. Columns L-Q show the cumulative score for each Problem. Columns R-T show each student's cumulative Score, the score Rounded to an integer (what integer is entered in the Grades spreadsheet), and Percent, based on the number of points the assignment is worth (here 25).

You can see your score and the rubric for questions 1-2 on Gradescope. If you think the rubric was applied incorrectly (see my solution too), I will soon open up these problems for regrading on Gradescope. If you have a question about the rubric, ask it on Ed Discussion. Students should talk to their TA, if they do not understand why they received the marks they did. The best time to talk with any TA about grades is during one of their Labs, when both student and TA/Reader are physically present to examine the submission and the grade, possibly running the solution on a computer they can share. Students should examine their graded work immediately and get any regrade issues settled as soon as possible (within a week of when the grade is assigned). IMPORTANT Information about Student Grades
|
#18: 5/13/22 In-Lab Programming Exam #2 Graded |
I have run the automatic batch self-check tests for In-Lab Exam #2 and the
grades are now recorded.
See the
assignment grades and
Grades(zipped .xlsm file) files, whose
details are discussed below, in Announcement #7.
The actual file I used for grading is
bsc.
You can find your solutions (by your ID Hashed), my solution, and the
bscile2S22.txt file that I used to compute grades for this
assignment on Ed Resources: see the name ile2materials.zip.
Each of the 7 parts was worth 100/7 (~14.3) points: 3 points for getting it completely correct and ~11.3 points for passing all the individual tests. So for example, if a student missed 5 of 20 tests on such a 14.3-point problem, they would receive 0 + 15/20*11.3 ~ 8.5 points. In this grading system, missing 1 test would result in 0 + 19/20*11.3 points, or 75% on that part.

If you scored better on In-Lab Exam #2 than on In-Lab Exam #1, your score for this exam will be highlighted in yellow, and the sum of these two scores (column X in the grades spreadsheet) will be (.25*In-Lab1 + .75*In-Lab2) + In-Lab2. So, if you scored 50 on In-Lab #1 and 80 on In-Lab #2, your total is (.25*50 + .75*80) + 80 = (12.5 + 60) + 80 = 72.5 + 80 = 152.5, which is rounded to 153, for an In-Lab average of 76.5%. Without this weighting of In-Lab #1, the average would be 65%.

The class average was about 89% and the median was about 100%. The skew between these statistics shows that although the majority of students did very well, there were some students who did very poorly, dragging down the average but not the median. At the extreme, 57% of the students scored 100% or more (because of the extra credit points; all required methods passed all batch self-check tests) and 7% of students scored below 60%. A total of 34% of the students who took this exam received one extra credit point and 5% received both extra credit points. The approximate distribution of grades on this In-Lab exam is 64% As, 14% Bs, 6% Cs, 9% Ds, and 6% Fs. This U-shaped distribution (80% As and Ds/Fs) is common for In-Lab Programming Exams, where we are testing competency/mastery of programming concepts: the ones who attained it scored As (could do everything in the allotted time); the ones who have not attained it scored Fs (solved just a few problems in the allotted time); only about 20% of the students scored somewhere in-between (spread out evenly).
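The replacement rule and its worked example above can be sketched as follows. The function name is mine, and rounding halves up is an assumption (it matches the stated 152.5 -> 153):

```python
# Sketch of the In-Lab #2 replacement rule: if a student improved on exam #2,
# exam #1 is reweighted as .25*inlab1 + .75*inlab2 before summing.
import math

def inlab_total(inlab1, inlab2):
    if inlab2 > inlab1:                      # improved on In-Lab #2
        total = (.25*inlab1 + .75*inlab2) + inlab2
    else:                                    # otherwise scores stand as-is
        total = inlab1 + inlab2
    return math.floor(total + .5)            # assumed: round halves up

print(inlab_total(50, 80))   # (12.5 + 60) + 80 = 152.5 -> 153
print(inlab_total(80, 50))   # no improvement: 80 + 50 = 130
```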
FYI, the averages for the different lab times were 89% for students in Labs 1, 2, and 3 (meeting at 9am), 89% for students in Labs 4, 5, and 6 (meeting at 11am), and 89% for students in Labs 7, 8, and 9 (meeting at 1pm), and 92% for students in Lab 10 (meeting at 3pm). Lab 1 (at 9am) had the highest average (96%) and Lab 2 (also at 9am) had the lowest average (83%). Students should talk to the TA for their Lab first, if they do not understand why they received the marks they did or dispute any of these marks. The best time to talk with your TA about grades is during one of your Labs, when both student and TA are physically present to examine the submission and the grade, possibly running the solution on a computer they can share. Students should examine their graded work immediately and get any regrade issues settled as soon as possible (within a week of when the grade is assigned). IMPORTANT Information about Student Grades
|
#17: 5/11/22 Program #3 Graded |
I have run the automatic batch self-check tests for Program #3 and the grades
are now recorded.
See the
assignment grades and
Grades(zipped .xlsm file) files, whose
details are discussed below, in Announcement #7.
I used a different bsc test file for
grading: one that defined a pnamedtuple('Quad1', 'x y z f'), having a
different number of fields with different names not in alphabetical order.
The class average was about 100% and the median was 104%, meaning that most
students correctly solved most problems, and over half (90%) of the class
correctly solved all the problems (or had minor deductions).
Note that this problem had an extra credit part, as well as the standard extra
credit for an early submission.
Overall there were 90% As, 2% Bs, 3% Cs, and 5% Ds and Fs.
About 48% of the students submitted early, and these early submitters scored
better (average of 104%) than students submitting on the due day (average
of 95%); I am assuming that some students ran out of time before they
finished all the problems, and will plan to get started earlier on later
programs.
In the assignment spreadsheet, Column A contains the Hashed IDs of all students (in sorted order); Column B contains an X if we believe the student submitted work on time (for pairs, only the submitting student will show an X, not their partner); Column C shows the extra credit points for early submissions. Row 2 shows the number of points each group of batch self-checks is worth; Row 3 shows the number of tests performed for each problem: all were batch self-check tests. Rows 4-5 show further information about the tests performed in each column. Rows 6 and beyond show the number of failed tests for each student (a blank indicates no failed tests: equivalent to 0 failed tests). To compute the number of points for a problem/in a column, compute the percentage of successful tests and multiply it by the number of points the problem is worth. So for example, if a student missed 1 of 4 tests on a 4-point problem, he/she would receive 3/4*4 = 3 points.

Column L shows each student's cumulative score, for all the tests in the single problem in this assignment. Columns M-O show each student's cumulative Score, the score Rounded to an integer (what integer is entered in the Grades spreadsheet), and Percent, based on the number of points the assignment is worth (here 50). Note that these columns are filled in both for submitters and their partners (these are the only columns filled in for partners): a partner should see his/her submitter's line for details.

Students should talk to the TA for their Lab first, if they do not understand why they received the marks they did or dispute any of these marks. The best time to talk with your TA about grades is during one of your Labs, when both student and TA are physically present to examine the submission and the grade, possibly running the solution on a computer they can share. Students should examine their graded work immediately and get any regrade issues settled as soon as possible (within a week of when the grade is assigned).
IMPORTANT Information about Student Grades
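The idea behind this assignment's pnamedtuple (mentioned earlier in this announcement) is writing a program that writes a class: build the class's source code as a string from a class name and field names, exec it, and return the resulting class object. Below is a tiny, hypothetical version of that idea; the real assignment generated much more (equality, indexing, accessor methods, etc.), so treat this only as a sketch of the technique.

```python
# Hypothetical miniature of the code-generation technique: given a class name
# and space-separated field names, generate source for __init__ and __repr__,
# exec it, and return the class object.
def make_point_class(class_name, field_names):
    fields = field_names.split()
    params = ', '.join(fields)
    lines  = [f'class {class_name}:',
              f'    def __init__(self, {params}):']
    for f in fields:
        lines.append(f'        self.{f} = {f}')
    args = ', '.join(f'{f}={{self.{f}}}' for f in fields)   # e.g. x={self.x}
    lines.append('    def __repr__(self):')
    lines.append(f"        return f'{class_name}({args})'")
    namespace = {}
    exec('\n'.join(lines), namespace)   # compile/run the generated source
    return namespace[class_name]

Quad1 = make_point_class('Quad1', 'x y z f')   # the grading test's spec
q = Quad1(1, 2, 3, 4)
print(q)    # Quad1(x=1, y=2, z=3, f=4)
```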
This assignment was designed to illustrate the richness of ways to solve programming problems: writing a program that automatically writes a class, given the required information to specify it (class name and fields). It also provided an opportunity to improve your string-processing abilities. As with all assignments, you should examine my solution. |
#16: 5/9/22 Quiz #5 Graded |
I have run the automatic batch self-check tests for Quiz #5
and the TAs/Readers have checked all of the solutions for appropriate
use of recursion (as discussed in functional programming), to solve each
problem.
I used the following batch self-check file
(similar to the one I provided, but with some altered/additional tests).
The grades are now recorded.
See the
assignment grades and
Grades(zipped .xlsm file) files, whose
details are discussed below, in Announcement #7.
The class average was about 94% and the median was 100%, meaning that most
students correctly solved most problems, and about two-thirds (67%) of the
class correctly solved all the problems (or had minor deductions).
Overall there were 67% As, 22% Bs, 6% Cs, and 5% Ds and Fs.
About 62% of the students submitted early, and these early submitters scored
much better (average of 99%) than students submitting on the
due day (average of 86%); I am assuming that some students ran out of time
before they finished all the problems, and will plan to get started earlier
on later quizzes.
In the assignment spreadsheet, Column A contains the Hashed IDs of all students (in sorted order) and Column B contains an X if we believe the student submitted work on time; Column C shows the extra credit points for early submissions. Row 1 for Columns D-H shows how many points the problems were worth. Row 2 shows the number of tests performed for each problem. Rows 4 and beyond show the number of failed tests for each student (a blank indicates no failed tests: equivalent to 0 failed tests). To compute the number of points for a problem/in a column, compute the percentage of successful tests and multiply it by the number of points the problem is worth. So for example, if a student missed 5 of 20 tests on a 4-point problem, he/she would receive 15/20*4 = 3 points.

A -1 in Columns I-M means that the student did not solve the problem according to the requirements (see the comments there for more information). In such cases, not only was this point lost, but 1/2 the correctness points as well. Columns N-P show each student's cumulative Score, the score Rounded to an integer (what integer is entered in the Grades spreadsheet), and Percent, based on the number of points the assignment is worth (here 25). The Readers who graded columns I-M for these problems:
Students should examine their graded work immediately and get any regrade issues settled as soon as possible (within a week of when the grade is assigned). IMPORTANT Information about Student Grades
|
#15: 5/2/22 Midterm Graded |
The TAs/Readers and I have graded and recorded the scores for the midterm exam.
The TAs will distribute the graded midterms in their labs this week.
If you do not pick up your exam then, you will have to come to my office hours
to retrieve it (and I would prefer not to have hundreds of exams stockpiled
in my office).
See the
assignment grades and
Grades(zipped .xlsm file) files.
If you took the exam but do not show a score in the spreadsheets, please contact me ASAP. Sometimes the TAs enter a score on the wrong line (the scores are hand-entered), but since we have all the original exams that problem is easy to rectify.

The class average was about 57% and the median was the same. Of course, because the average was below 75%, about 18 normalization points (18%) will be added when computing the average of all graded instruments on the spreadsheet. The grades recorded in the spreadsheet (both in Columns S and T) are the actual exam grades (without normalization points; see cell S8, highlighted in yellow, for the actual number of normalization points that will boost your score). So your actual score, for computing your average in column AJ, is the sum of your recorded score + all normalization points. These extra points are added into cell AB (your final average); AA includes only a sum of the points you received, without normalization points.

After normalizing the scores on the midterm, overall there were 19% As, 20% Bs, 22% Cs, 21% Ds, and 18% Fs; last Winter there were 29% As, 19% Bs, 13% Cs, 20% Ds, and 19% Fs; last Fall there were 23% As, 19% Bs, 21% Cs, 19% Ds, and 18% Fs; last Spring there were 22% As, 22% Bs, 19% Cs, 20% Ds, and 17% Fs. Here is a list of the normalized averages for the various pages. Recall that although the exam had 110 points, 10 points were counted as extra credit, so the final averages were computed out of only 100 points.
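A plausible reading of the normalization rule, inferred from the numbers in this announcement (raw average 57% -> about 18 points) rather than from any stated formula: when a testing instrument's raw class average falls below 75%, enough percentage points are added to bring that average up to 75%.

```python
# ASSUMED formula, inferred from this announcement's numbers; not an
# official statement of the course's normalization rule.
def normalization_points(raw_class_average_pct):
    return max(0, 75 - raw_class_average_pct)

print(normalization_points(57))   # 18, matching this midterm
print(normalization_points(80))   # 0: no normalization when average >= 75%
```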
Now is a good time to look at course grades as well, as we have graded nearly half of the total number of testing instruments (410 of 1,000 points). Now is the first time that recorded grades are truly meaningful, because they include testing instruments in all the major categories: quizzes, programs, in-lab exams, and written exams. The approximate distribution of course grades (for those students who submitted a midterm exam) is 52% As, 23% Bs, 10% Cs, and 15% Ds and Fs (in Winter there were 57% As, 22% Bs, 11% Cs, and 10% Ds and Fs; in Fall there were 44% As, 23% Bs, 13% Cs, and 20% Ds and Fs; last Spring there were 52% As, 26% Bs, 11% Cs, and 11% Ds and Fs), numbers much better than my original prediction of 25% in each of these four categories (e.g., we have 75% As and Bs instead of 50% As and Bs). Note that final grades for students finishing ICS-33 last quarter were shifted a bit lower than the midterm grades: 53% As, 25% Bs, 14% Cs, and 8% Ds and Fs. Here is a list of who graded which problems.
I would like to thank the TAs/Readers for their efforts over the weekend. Each spent about 12 hours preparing to grade and grading their problems (and then entering the grades onto the spreadsheet) on over 350 exams; they will spend even more time finishing all regrading. If you have any issues with how any exam problem was graded, talk to the staff member who graded it, and they can discuss the rubric with you and resolve any issues. But first, please examine my solution and understand the differences between it and your answer. Students should examine their graded work immediately and get any regrade issues settled as soon as possible. You can see the TAs in their labs on Tuesday this week: on 5/5. I'm going to ask the Readers to visit labs on this day too: I'll announce their schedule in an email soon.

Because we examined code (unlike for the In-Lab exam), partial credit was awarded, but sometimes points were deducted not for correctness issues, but for stylistic issues: e.g., using non-optimal views (e.g., .keys(), .values(), and .items() for dictionaries); using unnecessary data structures, loops, ifs, dictionary accesses; not using boolean values simply; not unpacking appropriately; not using the 9 important functions when useful, etc. This happened most frequently with problems on pages 1 and 4.

Important: As with the In-Lab Exams, if a student performs better on the Final Exam (since it is cumulative), I will increase their Midterm Exam score to be 75% of the Final normalized score + 25% of the Midterm normalized score. Students showing a red cell in columns AC-AD have a computed grade of C or above, but have not met the requirement that the average of either their In-Lab OR Written exams is at least 72.5%: these students will receive a C- grade if one of these averages does not improve. Students often complain about this policy, and I have investigated doing away with it, by making the In-Lab and Written Exams worth more points.
When I recomputed my spreadsheet, all the red-cell students would still end up scoring a C- or below, but many other students would also receive lower grades. So, by keeping the current system, the same students would not "pass," but other students will have higher grades. That is why I still use the current policy. Finally, the normalized average for the 9am labs was 73%, the 11am labs was 74%, the 1pm labs was 77%, and the 3pm lab (there was just one) was 76%. If I grouped every student into a random lab time, the averages would still show similar variation: 74%, 74%, 77%, and 73%. The normalized averages for the individual labs were Lab 1: 78%, Lab 2: 68%, Lab 3: 74%, Lab 4: 74%, Lab 5: 76%, Lab 6: 71%, Lab 7: 80%, Lab 8: 76%, Lab 9: 75%, and Lab 10: 76%. So, the relative scores differed among different labs at the same time as well, with the highest scoring lab scoring 80% (at 1pm) and the lowest scoring lab scoring 68% (at 9am). |
#14: 5/2/22 Quiz #4 Graded |
I have run the automatic batch self-check tests for Quiz #4 and the grades are
now recorded.
I used the following batch self-check file
(similar to the one I provided, but with some altered/additional tests).
See the assignment grades and Grades(zipped .xlsm file) files, whose details are discussed below, in Announcement #7. The class average was about 95% and the median was 100%, meaning that most students correctly solved most problems, and more than half (87%) of the class correctly solved all the problems (or had minor deductions). Overall there were 87% As, 5% Bs, 2% Cs, and 6% Ds and Fs. Some students scored 0 because their code timed out on one of the problems; see the information below about rectifying this issue. About 47% of the students submitted early, and these early submitters scored better (103% average) than students submitting on the due day (89% average); I am assuming that some students ran out of time before they finished all the problems, and will plan to get started earlier on later quizzes. In the assignment spreadsheet, Column A contains the Hashed IDs of all students (in sorted order) and Column B contains an X if we believe the student submitted work on time; Column C shows the extra credit points for early submissions. Row 1 for Columns D-K shows how many points the problems were worth. Row 2 shows the number of tests performed for each problem. Row 3 shows the parts of the problems in more detail. Rows 4 and beyond show the number of failed tests for each student (a blank indicates no failed tests: equivalent to 0 failed tests). To compute the number of points for a problem/in a column, compute the percentage of successful tests and multiply it by the number of points the problem is worth. So for example, if a student missed 5 of 20 tests on a 4 point problem, he/she would receive 15/20*4 = 3 points. Columns L-M show the cumulative score for each Problem. Columns N-P show each student's cumulative Score, the score Rounded to an integer (what integer is entered in the Grades spreadsheet), and Percent, based on the number of points the assignment is worth (here 25). 
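The per-column computation described above can be sketched as follows (a minimal illustration of the stated formula, not the actual grading spreadsheet):

```python
def problem_points(tests_total, tests_failed, points_worth):
    # Fraction of passing tests, scaled by the points the problem is worth.
    return (tests_total - tests_failed) / tests_total * points_worth

# The example above: missing 5 of 20 tests on a 4-point problem.
assert problem_points(20, 5, 4) == 3.0
```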
Students should talk to the TA for their Lab first, if they do not understand why they received the marks they did or dispute any of these marks. The best time to talk with your TA about grades is during one of your Labs, when both student and TA are physically present to examine the submission and the grade, possibly running the solution on a computer they can share. Students should examine their graded work immediately and get any regrade issues settled as soon as possible (within a week of when the grade is assigned). IMPORTANT Information about Student Grades
|
#13: 4/27/22 Program #2 Graded |
I have run the automatic batch self-check tests for Program #2 and the grades
are now recorded.
I used the following batch self-check files
(similar to the ones I provided, but with some altered/additional tests).
See the
assignment grades and
Grades(zipped .xlsm file) files, whose
details are discussed below, in Announcement #7.
The class average was about 98% and the median was 102%, meaning that most
students correctly solved most problems, and well over half (91%) of the class
correctly solved all the problems (or had minor deductions).
Overall there were 91% As, 3% Bs, 1% Cs, and 5% Ds and Fs.
About 68% of the students submitted early, and these early submitters scored
a bit better (103% average) than students submitting on the due day (94%
average); I am assuming that some students ran out of time before they
finished all the problems, and will plan to get started earlier on later
programs.
Note: The second problem, the DictList class, was worth 30 of the 50 total points for this assignment. In the assignment spreadsheet, Column A contains the Hashed IDs of all students (in sorted order); Column B contains an X if we believe the student submitted work on time (for pairs, only the submitting student will show an X, not their partner); Column C shows the extra credit points for early submissions. Row 2 for Columns D-AA shows how many points the problems were worth. Row 3 shows the number of tests performed for each problem: all were batch self-check tests. Row 4 shows further information about the tests performed in each column. Rows 5 and beyond show the number of failed tests for each student (a blank indicates no failed tests: equivalent to 0 failed tests). To compute the number of points for a problem/in a column, compute the percentage of successful tests and multiply it by the number of points the problem is worth. So for example, if a student missed 1 of 4 tests on a 4 point problem, he/she would receive 3/4*4 = 3 points. Columns AG-AH show each student's cumulative score, for all the tests in each of the two problems in this assignment. Columns AI-AK show each student's cumulative Score, the score Rounded to an integer (what integer is entered in the Grades spreadsheet), and Percent, based on the number of points the assignment is worth (here 50). Note that these columns are filled in both for submitters and their partners (these are the only columns filled in for partners): a partner should see his/her submitter's line for details. Students should talk to the TA for their Lab first, if they do not understand why they received the marks they did or dispute any of these marks. The best time to talk with your TA about grades is during one of your Labs, when both student and TA are physically present to examine the submission and the grade, possibly running the solution on a computer they can share. 
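Summing those per-column values gives the cumulative Score and Percent. Here is a sketch with hypothetical column data (the triples below are made up for illustration, not taken from the real spreadsheet):

```python
# Hypothetical columns: (points_worth, tests_total, tests_failed).
columns = [(4, 4, 1), (6, 10, 0), (30, 25, 5)]

# Each column contributes its passing fraction times its worth.
score = sum(w * (t - f) / t for w, t, f in columns)
percent = score / 50 * 100   # this assignment is worth 50 points

assert score == 33.0                 # 3 + 6 + 24
assert round(percent, 2) == 66.0
```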
Students should examine their graded work immediately and get any regrade issues settled as soon as possible (within a week of when the grade is assigned). IMPORTANT Information about Student Grades
This assignment was designed to provide you with a good grounding in the use of classes and the practice of overloading operators in classes, including a bit of writing iterators. Quiz #4 covers decorators for iterators using generator functions in much more detail. All these topics will be tested again on the Midterm and In-Lab Exam #2. As with all assignments, you should examine my solutions. |
#12: 4/27/22 Quiz #3 Graded |
I have run the automatic batch self-check tests for Quiz #3 and the grades are
now recorded.
I used the following batch self-check files
(similar to the ones I provided, but with some altered/additional tests).
See the
assignment grades and
Grades(zipped .xlsm file) files, whose
details are discussed below, in Announcement #7.
The class average was about 95% and the median was 100%, meaning that most
students correctly solved most problems, and well over half (74%) of the class
correctly solved all the problems (or had minor deductions).
Overall there were 74% As, 17% Bs, 3% Cs, and 6% Ds and Fs.
About 57% of the students submitted early, and these early submitters scored
much better (101% average) than students submitting on the due day (87%
average); I am assuming that some students ran out of time before they
finished all the problems, and will plan to get started earlier on later
quizzes.
There were a few students whose code timed out when I graded it; I let the grading program run everyone's code for 30 seconds. If your code timed out, talk to your TA about replacing the body of any offending code with just pass, so that it won't time out, allowing all the other code to be graded. In the assignment spreadsheet, Column A contains the Hashed IDs of all students (in sorted order) and Column B contains an X if we believe the student submitted work on time; Column C shows the extra credit points for early submissions. Row 1 for Columns D-N shows how many points the problems were worth. Row 2 shows the number of tests performed for each problem. Rows 4 and beyond show the number of failed tests for each student (a blank indicates no failed tests: equivalent to 0 failed tests). To compute the number of points for a problem/in a column, compute the percentage of successful tests and multiply it by the number of points the problem is worth. So for example, if a student missed 5 of 20 tests on a 4 point problem, he/she would receive 15/20*4 = 3 points. Columns O-P show the cumulative score for each Problem. Columns Q-S show each student's cumulative Score, the score Rounded to an integer (what integer is entered in the Grades spreadsheet), and Percent, based on the number of points the assignment is worth (here 25). Students should talk to the TA for their Lab first, if they do not understand why they received the marks they did or dispute any of these marks. The best time to talk with your TA about grades is during one of your Labs, when both student and TA are physically present to examine the submission and the grade, possibly running the solution on a computer they can share. Students should examine their graded work immediately and get any regrade issues settled as soon as possible (within a week of when the grade is assigned). IMPORTANT Information about Student Grades
In the Date class, some students wrote operators that returned strings instead of Date objects; a few batch self-check tests failed in such cases, but many succeeded because the test calls the str function on the left argument; it is supposed to call str on a Date object, but if you return a string, it will call str on it, which returns the same string. |
#11: 4/18/22 Program #1 Graded |
I have run the automatic batch self-check tests for Program #1 and the grades
are now recorded.
I used the same batch self-check files that I provided for this assignment.
See the assignment grades and Grades(zipped .xlsm file) files, whose details are discussed below, in Announcement #7. Note that columns AK-AP show information about the submissions for each student: what days submissions occurred (AK-AM), the total number of submissions (AN), how many submissions were graded and counted (AO), and how many were graded but not counted (AP). Of 262 students/pairs submitting, 27 (10%) had one or more submissions not counted: they submitted more than one on the due date, and/or more than two during the last two days. Of these, 16 (6%) had more than one submission not counted. The class average was about 90% and the median was 102%, meaning that many students correctly solved most problems, and more than half (75%) of the class correctly solved all the problems (or had minor deductions). Overall there were 77% As, 7% Bs, 2% Cs, and 14% Ds and Fs. FYI, last Spring quarter, there were 75% As, 8% Bs, 3% Cs, and 14% Ds and Fs. About 41% of the students submitted early, and these early submitters scored a bit better (98% average, less than half a grade higher) than students submitting on the due day (94% average). I am assuming that some students ran out of time before they finished all the problems, and will plan to get started earlier on later programs. In the assignment spreadsheet, Column A contains the Hashed IDs of all students (in sorted order); Column B contains an X if we believe the student submitted work on time (for pairs, only the submitting student will show an X, not their partner); Column C shows the extra credit points for early submissions. If column B has a red triangle, hover over it to read its message; sometimes you have to right-click the comment, then select "Edit Comment" to enlarge the comment box to read all of it. Row 2 for Columns D-AA shows how many points the problems were worth (Columns W-AA record whether a reasonable/executable script was written for the parts of this programming assignment). 
All prompting and printing (except for the tracing) should appear in the script; if any of your code printed something, you will have to see your TA for a regrade (removing those print statements). Row 3 shows the number of tests performed for each problem: all were batch self-check tests. Rows 4-5 show further information about the tests performed in each column. Rows 6 and beyond show the number of failed tests for each submission (a blank indicates no failed tests: equivalent to 0 failed tests). To compute the number of points for a problem/in a column, compute the percentage of successful tests and multiply it by the number of points the problem is worth. So for example, if a student failed 1 of 4 tests on a 5 point problem, he/she would receive (4-1)/4*5 = 3/4*5 = 3.75 points. Columns AB-AF show each student's cumulative score, for all the tests in each of the problems in this assignment. Columns AG-AI show each student's cumulative Score, the score Rounded to an integer (what integer is entered in the Grades spreadsheet), and Percent, based on the number of points the assignment is worth (50). Note that these columns are filled in both for submitters and their partners (these are the only columns filled in for partners): a partner should refer to his/her submitter's line for grading details. Students should talk to the TA for their Lab first, if they do not understand why they received the marks they did or dispute any of these marks. The best time to talk with your TA about grades is during one of your Labs, when both student and TA are physically present to examine the submission and the grade, possibly running the solution on a computer they can share. Students should examine their graded work immediately and get any regrade issues settled as soon as possible (within a week of when the grade is assigned). IMPORTANT Information about Student Grades
This assignment was designed to provide you with a good grounding in the use of the standard data structures in Python: list, tuple, set, and especially dict (and the defaultdict variant). It also included practice iterating over such structures, writing comprehensions, and using the sorted function, lambda, and other useful/important Python functions. Unlike Quiz #1, the problems were bigger, requiring more interesting algorithms to solve, but still all expressible with a small number of Python language features. All these topics were tested on In-Lab Exam #1 (I assume students did well on the exam because they learned the material here and in Quiz #1) and will be tested again on the Midterm. As with all assignments, you should examine my solutions. I hope the "tracing" requirements for some of the problems showed you how to instrument the code you write to aid in debugging: if you added the tracing code after your program was running correctly, you missed the point of this part of the assignment. |
#10: 4/18/22 Quiz #2 Graded |
I have run the automatic batch self-check tests for Quiz #2 and the grades are
now recorded.
I used the following batch self-check and
data files.
See the assignment grades and Grades(zipped .xlsm file) files, whose details are discussed below, in Announcement #7. The class average was about 93% and the median was 100%, meaning that most students correctly solved most problems; about 78% of the class correctly solved all the problems (or had minor deductions). Overall there were 78% As, 3% Bs, 8% Cs, and 11% Ds and Fs; last quarter there were 86% As, 2% Bs, 4% Cs, and 8% Ds and Fs. About 48% of the students submitted early, and these early submitters scored much better (102% average) than students submitting on the due day (84% average). I am assuming that some students ran out of time before they finished all the problems, and will plan to get started earlier on later quizzes. In the assignment grades spreadsheet, Column A contains the Hashed IDs of all students (in sorted order). Column B contains an X if we believe the student submitted work on time; Column C shows any extra credit point for submitting early: early means that the Checkmate submission was 1 or more days early. Row 1 for Columns D-L shows how many points the problems were worth. Row 2 shows the number of tests performed for each problem. Rows 4 and beyond show the number of failed tests for each student (a blank indicates no failed tests: equivalent to 0 failed tests). To compute the number of points for a problem/in a column, compute the percentage of successful tests and multiply it by the number of points the problem is worth. So for example, if a student missed 5 of 20 tests on a 4 point problem, he/she would receive (20-5)/20*4 = 15/20*4 = 3 points. Columns M-O show each student's cumulative Score, the score Rounded to an integer (what integer is entered in the Grades spreadsheet), and Percent, based on the number of points the assignment is worth (here 25). Students should talk to the TA for their Lab first, if they do not understand why they received the marks they did or dispute any of these marks. 
The best time to talk with your TA about grades is during one of your Labs, when both student and TA are physically present to examine the submission and the grade, possibly running the solution on a computer they can share. Students should examine their graded work immediately and get any regrade issues settled as soon as possible (within a week of when the grade is assigned). IMPORTANT Information about Student Grades
|
#9: 4/15/22 In-Lab Programming Exam #1 Graded |
I have run the automatic batch self-check tests for In-Lab Exam #1 and the
grades are now recorded.
The tests were similar to the ones you were provided: if you wrote reasonable code that
worked during the exam, your code should have produced the same results for
the tests I used for grading.
See the
assignment grades and
Grades(zipped .xlsm file) files, whose
details are discussed below, in Announcement #7.
You can find my solution, and the actual bsc and data files that I used to compute grades for this assignment, by following the Solutions (Ed Resources) link for this class (see the name ile1materials.zip). To test your code with these new files, you must put them in your project folder, comment out all the tests in the script in your code (the ones you ran during the exam), and then add and run the following code:

    import driver
    driver.default_file_name = "bscile1S22.txt"
    driver.driver()

I believe the In-Lab Exams are the best indicator, of all testing instruments, of your ability to program: read specifications and transform them into working code (writing code and debugging it). In-Lab Exams are mastery exams: if you have a mastery of the materials, you should be able to solve all these problems in the allotted time. Technically, to earn an A on this exam, I think students should be able to solve all the problems and the extra credit problem: many students did. But that was not the grading criterion. As I'll say in class, Tolstoy is often quoted (from Anna Karenina) as writing, "Happy families are all alike; every unhappy family is unhappy in its own way." My adaptation of this quote is "High-scoring students are all alike (knowing how to program well); every low-scoring student did poorly in his/her own way: e.g., lack of programming or debugging ability, freezing on the exam, misreading or misunderstanding some problem statements, spending too much time debugging one problem, being ill when taking the exam, arriving late, etc." So, I understand that there are many possible reasons that students don't do well on In-Lab Exams. If you did poorly, think about why; but don't fool yourself. The spreadsheet computes grades as follows: if a problem passed all tests, you received 20 points for it; if it failed one test (most problems had two tests), you scored 7 points for it; if it failed more than one test, you received 0 points for it. 
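That 20/7/0 rule can be sketched as follows (a minimal illustration; the spreadsheet's actual formula may differ):

```python
def inlab_problem_points(tests_failed):
    # Per-problem rule from above: all tests passed earns 20 points,
    # exactly one failed test earns 7, and more than one earns 0.
    if tests_failed == 0:
        return 20
    if tests_failed == 1:
        return 7
    return 0

assert [inlab_problem_points(f) for f in (0, 1, 2)] == [20, 7, 0]
```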
Column I computes this number, which is also the same as the rounded value (Column J) and the percentage (Column K). The result: the class average was about 89% and the median was 100%. The skew between these statistics shows that although the majority of students solved most of the problems correctly, there were other students who did very poorly, which dragged down the average much more than the median. At the extremes, 73% of the students submitted code in which all five functions passed all batch self-check tests (about 65% of those had the extra credit problem correct too); 13 students submitted code in which no functions passed any batch self-check tests. The approximate distribution of grades on this In-Lab exam is 73% As, 10% Bs, 1% Cs, 7% Ds, and 9% Fs; a few years ago I gave a similar exam and the grades were 65% As, 11% Bs, 1% Cs, 9% Ds, and 15% Fs. This U-shaped distribution (89% either As or Ds/Fs) is common for In-Lab Programming Exams, where we are testing competency/mastery of programming concepts: the ones who attained it scored As (could do everything in the allotted time); the ones who have not attained it scored Ds/Fs (solved just a few problems in the allotted time); only about 11% of the students scored somewhere in-between (all Bs/Cs). There were about 180 students (48% of the class) who solved the extra credit problem. FYI, the averages for the different exam times were 87% for students in Labs 1-3 (meeting at 10am), 89% for students in Labs 4-6 (meeting at 12noon), 89% for students in Labs 7-9 (meeting at 2pm), and 92% for students in Lab 10 (meeting at 4pm); Lab 1 (at 9am) had the highest average (96%) and Labs 2, 3, and 9 (at 9am and 1pm) had the lowest average (82%). Remember, we didn't grade on simplicity of solutions or good use of Python: we graded just on correctness; you can still learn something by looking at my solutions. 
On the written exams, which are all graded by hand, we pay closer attention to simplicity and good use of Python. Students should talk to the TA for their Lab first, if they do not understand why they received the marks they did or dispute any of these marks. The best time to talk with your TA about grades is during one of your Labs, when both student and TA are physically present to examine the submission and the grade, possibly running the solution on a computer they can share. Students should examine their graded work immediately and get any regrade issues settled as soon as possible (within a week of when the grade is assigned). IMPORTANT Information about Student Grades
Finally, if students score a higher percentage on their In-Lab Exam #2 (which involves material from the first, as well as Classes, Operator Overloading, and writing Iterators), I will score their In-Lab Exam #1 higher. In the recent past, I have built a composite score that is 25% the first score and 75% the second. Therefore, even a terrible grade on this exam can have a minimal effect on your final grade if you perform much better on In-Lab Exam #2. |
#8: 4/10/22 Quiz #1 Graded |
I have run the automatic batch self-check tests for Quiz #1 (checking
correctness) and the Readers/TAs have examined problem 1 and the code
(checking requirements: e.g., 1 return statement/solution for 3a, 3b,
and 3c) and the grades are now recorded and posted.
I used the following batch self-check file.
You should run your program on this file to understand your recorded grade.
See the assignment grades and Grades(zipped .xlsm file) files, whose details are discussed below, in Announcement #7. The class average was about 83% and the median was 92%, meaning that many students correctly solved most problems; in fact 24% of the students scored 100%. Overall there were 53% As, 23% Bs, 6% Cs, and 18% Ds and Fs for those students who submitted work; some of the students who scored near 0 submitted code that we could not run (see the paragraphs below for possible regrading by your TA). FYI, a recent quarter with a similar quiz had the following grades: 66% As, 18% Bs, 5% Cs, and 11% Ds and Fs for those students who submitted work. If the average of all students on any testing instrument is less than 75%, the column for that instrument will show the number of normalization points in row 8 for that column, highlighted in yellow: the number of points that need to be added to each student's score so that the instrument's average is 75%; these points are then added to the sum of the points for each student (in column AB) to compute their grade (so some students will score more than 100%). This is the only "curving" I do when grading; I do NOT do special curving at the end of the quarter. On this testing instrument there were no normalization points, because the average was above 75%. About 46% of the students submitted early (for quizzes there is one extra credit point for submitting one day early) and these early submitters scored much better than students submitting on the due day (94% compared to 74%): a difference of 2 full grades! I am assuming that some students ran out of time before they finished all the problems, and will plan to get started earlier on later quizzes. In the assignment grades spreadsheet, Column A contains the Hashed IDs of all students (in sorted order). 
Column B contains an X if we believe the student submitted work on time; Column C shows any extra credit point for submitting early: early means that the Checkmate submission was 1 or more days early. Row 1 for Columns D-N shows how many points the problems were worth. Some problems show points in two columns: e.g., Problem #3a has 2 points in Column G (3a/C: produced correct answers, graded by the batch self-checks) and 1 point in Column J (3a/R: the requirement of 1 return statement, graded by the staff). Any /C column relates to correctness; any /R column relates to requirements. Note that you did not receive any Requirement points unless you received at least some Correctness points: you do not get credit for writing one line of code that never worked. Row 2 shows the number of batch self-check tests performed for each problem (for those checked automatically; for the other columns it is typically the number of points the problem is worth). Rows 4 and beyond show the number of failed tests for each student (a blank indicates no failed tests: equivalent to 0 failed tests). IMPORTANT: To compute the number of points you scored for a problem/in a column, compute the percentage of successful tests and multiply it by the number of points the problem is worth. So for example, if a student missed 2 of 6 tests on a 5 point problem, they would receive (6-2)/6 * 5 = 3.3333... points for that column: they got 2/3 of the tests correct for a 5 point problem, so they get 2/3 * 5 points. Columns O-Q show each student's cumulative Score, the score Rounded to an integer (that integer is the score entered in the Grades spreadsheet), and Percent, based on the number of points the assignment is worth (here 25). Requirement points for the functions were deducted only for too many statements. But many students created extra/temporary data structures that are not needed (extra names are OK; but extra/unneeded data structures are not). 
For example, if sorted is called using any iterable, you do NOT need to first create a list of the iterable; just call sorted on the iterable directly. We will deduct points on written exams for poor/unnecessary use of Python. Look at my solutions to see how to avoid such extra data structures in future work. For the bsc testing, students should talk to the TA for their Lab first, if they do not understand why they received the marks they did or dispute any of these marks. They can access your code and re-run the grading program on it. The best time to talk with your TA about grades is during one of your Labs, when both student and TA are physically present to examine the submission and the grade, possibly running the solution on a computer they can share.
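The earlier point about calling sorted directly on an iterable can be illustrated as follows (the dictionary here is made up):

```python
d = {'b': 2, 'a': 1, 'c': 3}

# Unneeded: copying the keys into a throwaway list first.
keys = sorted(list(d.keys()))

# Better: sorted accepts any iterable, including the dict itself,
# which iterates over its keys directly.
keys = sorted(d)
assert keys == ['a', 'b', 'c']
```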
Students should examine their graded work immediately and get any regrade issues settled as soon as possible (within a week of when the grade is assigned). Show up to lab and settle these issues immediately. IMPORTANT Information about Student Grades
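The normalization rule described earlier in this announcement can be sketched as follows (a minimal illustration; scores are percentages, and the function name is my own):

```python
def normalization_points(percent_scores, target_average=75.0):
    # Points added to every student's score so that the instrument's
    # class average reaches the target; zero when the average is
    # already at or above the target (as it was for this quiz).
    average = sum(percent_scores) / len(percent_scores)
    return max(0.0, target_average - average)

assert normalization_points([70, 80, 90]) == 0.0   # average 80, no curve
assert normalization_points([60, 70, 80]) == 5.0   # average 70, add 5
```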
|
#7: 4/4/22 Programming Assignment #0 Graded |
The TAs/Readers have graded (and I have recorded the grades for) Programming
Assignment #0.
As with most assignments, there are two files that you should download, unzip,
and examine to understand your performance on this assignment, and your
cumulative performance in this class.
Learn to download, unzip, and read these files now, so you will know how to
do it for all the later assignments.
Both of these files are sorted by Hashed IDs (which are computed from the 8-digit UCI IDs of all the students in the class). To determine your Hashed ID, see Message #6a below.
IMPORTANT: Scores will soon revert to 0, if I do not receive a signed Academic Integrity Contract from you (we are tabulating them this week). Please submit a .pdf file to Checkmate, showing the signed/dated form. This assignment was designed to test whether you have mastered the basics of using Python in Eclipse, the Eclipse Debugger perspective, and batch self-check files in the driver.py module (in courselib). It was also designed to see if you could follow instructions and ask questions: more on that below. The class average was about 9.8 (or about 98%) and the median was 10 (or about 100%). For those students submitting work, there were 78% As, 12% Bs, 6% Cs, and 4% Ds and Fs (I don't often distinguish these two non-passing grades). Last quarter there were 83% As, 10% Bs, 2% Cs, and 5% Ds and Fs. The assignment was not meant to be hard, but it was not trivial either, and there were many opportunities to lose points (and learn from your mistakes). Your work in the Eclipse/Python Integrated Development Environment (IDE) throughout the quarter will leverage the understanding and skills that you acquired in this assignment. Let me talk about what will probably be the most contentious half point of the 1,000 points that this course is worth (thus, only .05% of the grade; so you can still get 99.95% of the points in this course). I took off .5 points if you corrected the misspelling Inteprxter (or had anything other than Inteprxter). When some students hear about this point deduction, their heads explode and they cannot believe that I am taking off a point for correcting what they thought was my mistake. But... I am trying to foster an atmosphere where nothing is taken for granted in the instructions that I give: if anything seems confusing or plain wrong, I should be questioned about it -preferably in public, in the appropriate Ed Discussion category- so others can learn if there really is a problem and, if so, the correction.
Here is some insight into how the parts were graded.
Finally, about 40% of the students submitted the program 2 or more days early (their average was 110%); about 26% submitted the program 1 day early (their average was 98%). So, about 66% of the students submitted this assignment early. The other 34% of the students had an average of 85%, which was much lower. Keep up the early submissions: although it will be harder in upcoming assignments, it is doable, and it is to your advantage to try. You can earn up to 20 extra points if you turn in every Programming Assignment and Quiz early (upping your grade by 2%): for some students, this boost will be enough to raise their final letter grade. It will be to everyone's benefit -students and staff alike- if students try to finish and submit early. IMPORTANT If you believe that we graded your work incorrectly, please examine the files mentioned above first, then contact the TA/Reader who graded your work, to discuss the issues with him/her (not me, yet). Such a discussion can have only positive outcomes: either they will agree with you that you deserve more credit (and, we do want you to receive all the credit that you are due), or you will come to understand the question, program, or solution better and realize why you lost points. This is certainly a win-win situation. Please read my solution and the assignment grades spreadsheet carefully before contacting a TA/Reader; ensure that you understand what is the correct answer and what points were deducted from your assignment and why. If there is a problem, the TA/Reader will email me a revised summary about your program, and cc a copy to you. I will update the grades spreadsheet as appropriate and email you what I did. Confirm the change when I release the spreadsheet for the next graded assignment. If you feel there is still an unresolved problem after talking to a TA/Reader, please contact me (but always contact your TA/Reader first). 
IMPORTANT: Also, because of the size of this class, if you have a grading issue, we will consider it only if you bring it to your TA's/Reader's attention within a week of when I return the materials. This policy is in place to avoid an avalanche of work because of "grade-grubbing" late in the quarter. |
#6a: 3/28/22 Hashed ID Lookup |
When we grade assignments, we often distribute/update various spreadsheets
with the relevant grading information.
These spreadsheets are indexed and sorted by each student's Hashed ID.
The course web-page has a Find ID Hashed link (the leftmost
bottom/green link on the course web page), which you can use to retrieve
your Hashed ID (or click
Find ID Hashed).
Use the result it shows when examining any spreadsheets of grades;
I suggest that you find this number once, and write it down for future
reference.
|
#6d: 3/28/22 Ed Discussion Signup |
Please visit Ed Discussion Signup and sign up for ICS-33. |
#5: 3/28/22 Important: Submitting Code without Losing Points |
ICS-33 uses software that automatically checks the correctness of code in most
quizzes and programming assignments; it uses (self-checking) test cases
that we supply with the assignments that we distribute (sometimes slight
variants).
You will learn about these tools in Programming Assignment #0.
Here are a few hints to ensure that you will understand the grading process
better and minimize your point loss.
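To make "self-checking test cases" concrete, here is a minimal sketch of what such a check does conceptually. The actual batch-self-check interface lives in the course's driver.py module and is not reproduced here; the function `square`, the `checks` list format, and `run_checks` below are purely illustrative assumptions, not the real API.

```python
# A conceptual sketch of self-checking test cases (NOT the real driver.py API).

def square(x):          # stand-in for a function an assignment might ask for
    return x * x

# Each check pairs a description and an actual result with its expected result.
checks = [
    ("square(3)",  square(3),  9),
    ("square(-4)", square(-4), 16),
]

def run_checks(checks):
    """Report each check as passed/FAILED and return the number of failures,
    roughly the way a batch self-check run summarizes results."""
    failures = 0
    for label, actual, expected in checks:
        if actual == expected:
            print(f"passed: {label} -> {actual}")
        else:
            failures += 1
            print(f"FAILED: {label} -> {actual}, expected {expected}")
    return failures

run_checks(checks)
```

The point of running these checks yourself, before submitting, is that the automated grader runs the same kind of checks (sometimes slight variants): if your code passes them locally, your graded score should hold few surprises.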
After an assignment is graded automatically, the Announcement for it will contain a link to an Excel file that you can examine for detailed information about how your score was computed.

If this information does not match your expectations from running the assignment's self-checks while developing your code, contact your TA. It is best to meet with your TA during lab hours: they can talk to you about your code and run it while you are present, to help resolve the difference. But, if we have to modify your code to grade it properly (see the typical source of problems above), then we will deduct points.

I hope that if students carefully write/submit their code, these grading anomalies and point deductions will be minimized during the quarter. |
#4: 3/28/22 Communication |
There are many ways to communicate with me (and other staff and students).
Here is a quick overview.
Note that for questions that are not specific to you -questions that are relevant to the entire class- it is best to ask them in the appropriate Ed Discussion thread.
|
#3: 3/28/22 First/Later Labs |
I expect students to attend all their scheduled labs (unless they have
already finished the current programming assignment).
Programming Assignment #0 is assigned before the first lab of the quarter; so
if you have not already finished it, I expect you to attend your first lab
and work on it there.
Generally, you can get invaluable help in lab from the TAs and Tutors.

For debugging, don't expect the staff to understand your code unaided and then debug it for you. Instead, expect to explain your code to them (and answer questions about it) so that they can help you learn how to debug code in general, using your current problem/code as a concrete example. TAs/Tutors will model the debugging process for you, so that you can follow it by yourself for subsequent bugs. One goal of ICS-33 is to make students much more independent programmers and debuggers: you should continually improve your debugging skills throughout the quarter. |
#2: 3/28/22 Install Course Software |
All students should download and install the course Software: Eclipse (which
installs Java, which is needed to run it) and Python.
Both products are available for free on the internet.
Students can view instructions for downloading and installing this software
by following the
Course Software
link.
If you are using a Mac, there are special instructions for you (which are a
bit out of date: I don't own a Mac): e.g., Java is already installed.
If you have installed a version of Python prior to 3.9, you should install the current version of Python (3.9 or later). If you have installed a version of Eclipse prior to 2019-06, you should install the current version of Eclipse (2021-06 or later). My PC instructions show installation of the latest versions available during Summer of 2021, so you will likely follow similar but not identical instructions.

Although students can work on their programming assignments on the computers in the UCI labs, I expect students with computers to download and install this software by the end of the first week of the quarter. If you are having difficulty with this task, the TAs and Lab Tutors will help you during the first Lab meeting (or beyond, if necessary: bring your computer to the lab). If you have successfully downloaded and installed this software, please help other students do so too. Finally, you can also use the Ed Discussion threads to ask questions about installing this software and to help other students install it: use the Logistics category. Installing software is sometimes confusing, but it is a one-time event: do it now; use it for the entire quarter.

I strongly suggest that you BACK UP YOUR WORK daily: computers can malfunction, break, or be stolen. Every quarter I hear from a few students who have lost their work because they didn't back it up; get into the backup habit now. I back up all my ICS-33 materials every day by zipping a folder that has all of them and putting it on a USB memory stick. |
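The daily zip-a-folder backup routine described above can be automated in a few lines of Python using the standard library. This is only a sketch under assumed paths: the folder name `ICS-33` and the destination `E:/backups` below are hypothetical; point them at your own course folder and USB stick.

```python
import shutil
from pathlib import Path

# Hypothetical paths: adjust to wherever your course folder and USB stick live.
source = Path("ICS-33")          # folder containing all course materials
backup_dir = Path("E:/backups")  # e.g., a USB memory stick

def backup(source: Path, backup_dir: Path) -> Path:
    """Zip the source folder into backup_dir and return the archive's path."""
    backup_dir.mkdir(parents=True, exist_ok=True)
    archive = shutil.make_archive(
        str(backup_dir / source.name),  # archive name (extension added for us)
        "zip",                          # archive format
        root_dir=source.parent,         # directory to archive from
        base_dir=source.name,           # folder to include in the archive
    )
    return Path(archive)

# Run once a day, e.g.: backup(source, backup_dir)
```

Because `shutil.make_archive` overwrites an existing archive of the same name, running this once a day keeps a single up-to-date snapshot on the stick; add a date to the archive name if you want to keep multiple snapshots.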
#1: 3/28/22 First Message |
Welcome to ICS-33.
I am going to post and archive important messages about the class in this
announcements web page: each entry will be numbered, dated, and labeled.
The entries will appear in reverse chronological order.
Whenever you follow the link to this page, scan its top for new announcements;
scan downward for older announcements.
This message will always appear at the bottom of this page.
I will never remove a message from this page.

I have already posted some important messages before the start of the quarter. Expect a few new messages to be posted here each week, mostly regarding understanding returned and graded work. Check this Announcements page, along with your email and Ed Discussion threads, daily.