Reflective Comments about the 2015 CEAT Summer Bridge Program

As I have discussed repeatedly, I had the privilege of teaching in the 2015 Oklahoma State University College of Engineering, Architecture, and Technology’s Summer Bridge Program. Over the three weeks of the program, I worked with a number of excellent students to help prepare them not only for the writing they will do as first-year students at the school, but also for the writing they can expect to do later in their collegiate careers and as professionals afterwards. While I offered comments on daily progress throughout the program, a final set of reflective comments seems to be in order. In what follows, I offer a breakdown of my class’s composition and performance before summarizing my impressions of the experience and its implications for my continued teaching. Included afterward is a copy of the lecture notes I compiled while teaching the course; how useful they will be without context, I am unsure, but they are provided nonetheless.

Class Demographics

At the beginning of the program, I had 29 students enrolled in the one section of the program’s technical writing class assigned to me. On the final day of the program, I had 25 still with me; five of my students withdrew from the program, and one was added. A survey of the students remaining in the program, conducted anonymously through a Google form and offering a small grade reward (as noted here), returned 24 results. Questions on the survey asked about student age, gender, race (working from 2010 US Census Bureau definitions), ethnicity (ibid.), socioeconomic status, major, minor, and GPA, and offered open-response questions regarding the conduct of the course as a whole.

Student ages clustered around 18, with 21 of the 24 respondents indicating it as their age. One reported being 17; two reported being 19. As the Summer Bridge Program is intended to help incoming first-year students, the ages reported are not surprising; they correspond to the largely traditional undergraduate student body at the institution.

Nineteen respondents identified as male and four as female; one opted not to self-identify gender. (Options given were “female,” “intersex,” “male,” “trans,” “prefer not to identify,” and “other.” The attempt was made to be both inclusive and respectful of self-identification. Suggestions for how to better handle future attempts are welcome.)

Twenty-one respondents identified as White, eight as American Indian or Alaska Native, three as Asian, one as Native Hawaiian or Other Pacific Islander, and one as Black or African-American. Students were allowed to select multiple categories; that some did so is certain, since the 34 selections exceed the 24 respondents. Only two respondents self-identified as Hispanic; no respondents opted against ethnic self-identification.

Socioeconomic class was likewise self-reported, through an open-response question. Most respondents answered with some variation of “middle class,” with three opting not to self-identify. Students were asked to elaborate; only one did, indicating “upper middle-class” status as a result of differences in parental salaries and work.

Thirteen students reported majoring in Mechanical & Aerospace Engineering, the clear majority. (Majors and minors are capitalized here to mark them as fields of study.) Three reported majoring in Chemical Engineering, and three others reported majoring in Electrical & Computer Engineering. Two each reported majoring in Architecture, Mechanical Engineering Technology, and an undefined Other. One reported majoring in Civil & Environmental Engineering, and one in Industrial Engineering & Management.

Minors were left as an open-ended question. Many respondents noted not intending to take a minor or being uncertain about doing so. Of those who offered affirmative responses about minors, five indicated opting for a minor in a business field. One each reported opting for a minor in Psychology, Computer Science, Biosystems, Chemistry, Japanese, and Mechanical Engineering.

All 24 respondents indicated being incoming freshmen. Two thirds (16) reported having no prior college GPA; one eighth (three) reported already having a GPA between 3.0 and 3.499, and slightly more than a fifth (five) reported already having a GPA above 3.5. One third of the respondents (eight) thus appear to have already earned college credit.

Information regarding course content will be reported to the Program directorate for review and adjustment of curricula moving forward. It may also be used in other professional development capacities.

Class Performance

Of the 25 students remaining enrolled in my section of the Program writing component, the performance of eleven was assessed by an outside grader hired by the Program; the other fourteen were assessed by me, the instructor of the course. Although the Program does not report figures for calculation of GPA, it does measure student performance internally, using the data in part for scholarship awards; it tracked student attendance and performance on assignments, the latter assessed against program-standard rubrics on the traditional United States percentage scale (i.e., 90% and above earns an A, 80-89% a B, and so on).

Attendance was most frequently determined by a sign-in sheet, as daily reports of class activities attest. Of the 25 students remaining enrolled, 19 attended all course meetings. Five incurred one absence, and one incurred two.

Of the 25 students remaining enrolled, four earned the equivalent of an A; the high score was a 93.9. Fifteen earned the equivalent of a B, four a C, one a D, and one an F; the low score was a 58.4. The average course score was 83.526. Low scores resulted in most cases from failure to submit one or more assignments.

Impressions and Implications

Overall, the experience of teaching in the Summer Bridge Program was a good one. The students seemed to benefit, and the exercise was enjoyable in itself. (The two do not necessarily coincide.) Other online commentaries have expressed a desire to see the Program, or programs like it, expanded, as students entering other fields of study are also likely to benefit from the kind of preparation it offers.

In assessing my students’ performance, I made a point of writing several hundred words of commentary in response to the submissions I received; typically, I provided between 200 and 300 words on each assignment. A number of students expressed their gratitude for that effort via email and in responses to the aforementioned survey, even though, in keeping with my common practice, I did not offer line-by-line proofreading commentaries. I have seen many students complain of the lack of such line-by-line “correction,” which I tend to resist because it denies students the opportunity to practice correcting their work for themselves. I have seen few who seem to appreciate (or, as happened many times in the Program, work to incorporate) the stylistic and other non-“grammatical” comments I leave. That I have seen evidence of my comments doing some good encourages me to continue to make them.

In those selfsame comments, I was able to work out better explanations for some of the principles of writing I hope to convey to my students, particularly those in the upcoming Fall 2015 term, in which I will be teaching composition exclusively. I already had ways of expressing those principles, but they were less effective than I had hoped, as students’ work tended not to reflect an understanding of them. Perhaps the revised presentation will do more to motivate students to adjust their work in light of the new information. (That I am thus able to model writing to learn also pleases me.)

If the opportunity arises, I will gladly teach for the Program again.

Summer 2015 CEAT Summer Bridge Writing Teaching Page
