Learning Outcomes - Assessment

The History


Miami Dade College determined that, for a college with seven campuses and approximately 2,000 graduates in each of the Fall, Spring, and Summer terms (approximately 6,000 graduates annually from the Associate and Baccalaureate programs): 1) the best approach to assessing general education outcomes was to identify students who would be completing their general education requirements and/or be eligible to graduate at the end of a given term; 2) the development of general education assessment tasks, and the scoring of student responses to those tasks, by faculty was critical to the assessment process; and 3) faculty scoring of the student responses by means of a rubric would provide results that identify strengths and opportunities for improvement in student attainment of the general education outcomes.

Content:

Validation of Assessment Tasks and Scoring Rubrics
General Education Outcomes Assessment Process
Future Reporting of General Education Outcomes Assessment Results

Validation of Assessment Tasks and Scoring Rubrics

A retreat was held September 29-30, 2006, for the faculty who represented the campuses and disciplines and who had accepted the invitation of the Provost for Academic and Student Affairs to serve on the General Education Assessment Team. The retreat was facilitated by three consultants: Mr. Pat Nellis, Valencia Community College; Dr. Virginia Johnson, Towson State University; and Dr. Robert Mayes, University of Wyoming. The consultants assisted the General Education Assessment Team in constructing assessment tasks and scoring rubrics, drawing on rubrics used at other institutions and on examples the consultants provided. The faculty agreed with the consultants' suggestion that the scoring rubric categories be stated positively. Based on their examination of rubrics used by other community colleges and universities, the faculty also determined that there should be four categories: emerging, developing, proficient, and exemplary.

Most tasks developed by the faculty assessed multiple general education outcomes, as indicated below.

General Education Outcome: Assessment(s) Used
Communicate effectively using listening, speaking, reading, and writing skills: Tasks 1, 2, 3, and 4
Use quantitative analytical skills to evaluate and process numerical data: Task 2
Solve problems using critical and creative thinking and scientific reasoning: Tasks 1, 3, and 4
Demonstrate knowledge of diverse cultures, including global and historical perspectives: Task 4
Create strategies that can be used to fulfill personal, civic, and social responsibilities: Task 3
Demonstrate knowledge of ethical thinking and its application to issues in society: Tasks 1 and 3
Use computer and emerging technologies effectively: Computer Skills Placement (CSP) Test
Demonstrate an appreciation for aesthetics and creative activities: Task 4
Describe how natural systems function and recognize the impact of humans on the environment: Tasks 1 and 2

Task formats: Task 1 (response on computer); Tasks 2, 4, and 5 (paper and pencil); Task 3 (oral presentation); CSP Test (administered online).


Students in each selected class were given one of the tasks to complete during the 50-minute class period. The content of each task is described below.

 

Task 1: Asks the student to read information about oil extraction in the local environment and respond on a computer to three prompts about the proposed drilling for oil.

Task 2: Asks the student to read and interpret four charts concerning international energy consumption.

Task 3: Asks the student to prepare and give an oral response to an ethical dilemma.

Task 4: Asks the student to choose a work from one of the following areas: architecture, literature, or visual arts, and respond to three prompts about the creativity, beauty, and cultural perspective of the work.

Task 5: Asks the student to respond to three prompts about information literacy.


Based on the results of a field test conducted for the general education outcomes assessment, MDC determined that the general education outcome pertaining to the use of computer and emerging technologies was better assessed through the online Computer Skills Placement Test, developed and distributed by ICDL. This test is administered in the Campus Testing Departments and is also used by MDC to determine computer skills proficiency for students who wish to test out of the MDC computer course for general education purposes.

Concurrent with the approval of the general education outcomes by the College Academic and Student Support Council (CASSC), a general education outcomes assessment process was developed and implemented as described below:


General Education Outcomes Assessment Process

The general education outcomes assessment process adheres to this schedule.

Year 1

 

June

Faculty are selected for the General Education Assessment Team (appointed by the Provost for Academic and Student Affairs) to review general education assessment tasks and scoring rubrics for each general education outcome. The Team creates working groups for this purpose.

September

The Director of Learning Outcomes Assessment selects a student sample of sufficient size to obtain completed tasks for at least 10% of potential term graduates. The selected students will be assessed using three of the tasks. Students will be selected from the pool of students who will have completed 100% of general education and degree requirements by the end of the Fall Semester. Classes in which these students are enrolled are identified, and the faculty of these classes are invited to assign general education assessment tasks to the class. Invited faculty will meet and be briefed by the Provost for Academic and Student Affairs about their role. The General Education Assessment Team will also emphasize the importance of validity in the administration of the assessment tasks. During the briefing, faculty will be asked to review the tasks and rubrics so they can see specifically what they are being asked to administer.
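As a rough illustration only (not part of MDC's documented procedure), the sketch below shows the sampling arithmetic implied above: how many students would need to be selected to yield completed tasks for at least 10% of potential term graduates. The graduate count and the assumed task-completion rate are hypothetical placeholders.

import math

def students_to_select(term_graduates: int,
                       target_fraction: float = 0.10,
                       assumed_completion_rate: float = 0.75) -> int:
    """Estimate how many eligible students to include in the sample so
    that completed tasks are obtained for at least `target_fraction`
    of potential term graduates.

    `assumed_completion_rate` is a hypothetical planning figure (the
    share of selected students expected to return a completed task),
    not an MDC statistic.
    """
    completed_needed = math.ceil(term_graduates * target_fraction)
    return math.ceil(completed_needed / assumed_completion_rate)

# With roughly 2,000 potential graduates in a term (see The History),
# at least 200 completed tasks are needed; at an assumed 75% completion
# rate, about 267 students would be selected.
print(students_to_select(2000))  # -> 267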

November

The Director of Learning Outcomes Assessment, in conjunction with the Academic Deans, distributes General Education Assessment Tasks to selected faculty, who assign the tasks to their classes.

Completed Assessment Tasks are returned by the faculty to their Campus Academic Deans, who forward them to the Director of Learning Outcomes Assessment. Tasks for selected students are pulled, names removed, and identifier codes assigned in preparation for the scoring sessions.

The General Education Assessment Team, through its Working Groups, uses the scoring rubrics to assess student responses to the assessment tasks. Each Working Group will consist of two faculty members who are trained to score student work for each task with the scoring rubrics. In cases where the two ratings do not agree, a third faculty member will score the student work, and the two ratings that agree will determine the final rating.
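The two-rater rule above can be read as a simple adjudication procedure. The following sketch is a minimal illustration, assuming the four rubric categories named earlier (emerging, developing, proficient, exemplary); the function itself and the handling of the case where all three raters disagree are illustrative assumptions, not MDC's actual scoring system.

from typing import Optional

# The four rubric categories adopted by the faculty (see above).
RUBRIC_LEVELS = ("emerging", "developing", "proficient", "exemplary")

def final_rating(rater1: str, rater2: str, rater3: Optional[str] = None) -> str:
    """Determine a final rubric rating from two trained raters.

    If the first two ratings agree, that rating stands. If they disagree,
    a third rater scores the work and the rating shared by two of the
    three raters is used. The all-three-disagree case is not specified
    in the process description; raising an error here is an assumption.
    """
    for rating in (rater1, rater2, rater3):
        if rating is not None and rating not in RUBRIC_LEVELS:
            raise ValueError(f"unknown rubric level: {rating}")

    if rater1 == rater2:
        return rater1
    if rater3 is None:
        raise ValueError("first two ratings disagree; a third rating is required")
    if rater3 == rater1:
        return rater1
    if rater3 == rater2:
        return rater2
    raise ValueError("no two ratings agree; resolve outside this rule")

# Example: the first two raters disagree, the third agrees with "proficient".
print(final_rating("developing", "proficient", "proficient"))  # -> proficient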

December

General Education Assessment Team completes assessment of tasks and forwards rubric assessments to the Director of Learning Outcomes Assessment.

The Director of Learning Outcomes Assessment begins to compile the report of General Education Assessment results.

Year 2

 

January

The Director of Learning Outcomes Assessment selects a student sample of sufficient size to obtain completed tasks for at least 10% of potential term graduates. The selected students will be assessed using the remaining two tasks and the Computer Skills Placement Test. Students will be selected from the pool of students who will have completed 100% of general education and degree requirements by the end of the Spring Semester. Classes in which these students are enrolled are identified, and the faculty of these classes are invited to assign general education assessment tasks to the class. Invited faculty will meet and be briefed by the Provost for Academic and Student Affairs about their role. The General Education Assessment Team will also emphasize the importance of the assessment, of motivating students, and of administering the tasks as consistently as possible. During the briefing, faculty will be asked to review the tasks and rubrics so they can see specifically what they are being asked to administer.

February

The Director of Learning Outcomes Assessment completes the compilation of results from the Fall Semester.

March

Assessment results are shared by the General Education Assessment Team and the Director of Learning Outcomes Assessment with CASSC, the Academic Deans, and the Student Deans, and at Conference Day. Action plans are developed by the General Education Assessment Team in collaboration with the Academic Deans.

The Director of Learning Outcomes Assessment, in conjunction with the Campus Academic Deans, distributes General Education Assessment Tasks to selected faculty, who assign the tasks to their classes. Completed Assessment Tasks are returned by the faculty to their Campus Academic Dean, who forwards them to the Director of Learning Outcomes Assessment. Tasks for selected students are pulled, names removed, and identifier codes assigned in preparation for the scoring sessions.

The General Education Assessment Team, through its Working Groups, uses the scoring rubrics to assess student responses to the assessment tasks. Each Working Group will consist of two faculty members who are trained to score student work for each task with the scoring rubrics. In cases where the two ratings do not agree, a third faculty member will score the student work, and the two ratings that agree will determine the final rating.

April

General Education Assessment Team completes assessment of tasks and forwards rubric assessments to the Director of Learning Outcomes Assessment.


Future Reporting of General Education Outcomes Assessment Results

In the third year of implementation, the cycle of reporting is as follows.

Fall/Spring

General Education Outcomes Assessment Conducted.

February

Assessment Results from the previous Spring and Fall Semesters will be reported by the General Education Assessment Team with assistance from the Director of Learning Outcomes Assessment to the General Education Committee, Academic Deans, Student Deans and CASSC.

March

Assessment Results reported by the General Education Assessment Team with assistance from the Director of Learning Outcomes Assessment during Conference Day.

Spring/Summer

Action Plans developed by the General Education Committee and General Education Assessment Team in collaboration with the Academic and Student Deans.


Several indirect measures will also be considered in the assessment of general education outcomes.

Level 2 Grade Reports: The Florida state universities have developed an Academic Learning Compact to assess the domains of communication, critical thinking, and discipline content knowledge among graduates of programs offered at the state universities. The Department of Education provides Miami Dade College with a Level 2 report that compares the grade performance of MDC transfer students and native state university students in their selected majors. These reports will be examined by MDC.

MDC Graduate Survey: Items on the MDC graduate survey asking respondents about their attainment of general education outcomes will be considered. The survey is administered by Institutional Research.

Community College Survey of Student Engagement and Faculty Survey of Student Engagement: These national surveys contain items about student engagement with best practices that encourage student learning in a variety of areas including general education outcomes.

Schools/Disciplines Annual Reports: An item will be included in these annual reports asking each School or Discipline about its contribution to the general education outcomes and about its course- or sequence-level assessment of those outcomes.

Program Review: The MDC Program Review Questionnaire will include an item about assessment of general education outcomes of the program's students.


Learning Outcomes Assessment Process Templates

Learning Outcomes Assessment Outcome Rubrics
