Comprehensive Standard: The institution identifies expected outcomes for its educational programs (including student learning outcomes for educational programs) and its administrative and educational support services; assesses whether it achieves these outcomes; and provides evidence of improvement based on analysis of those results.
X Compliance
Partial Compliance
Non-compliance
Narrative:
Georgia State University is committed to providing educational programs and administrative and educational support services that set high standards, to assessing whether the desired outcomes have been achieved, and to using the results of those assessments to improve. The University uses parallel processes for academic and administrative units, each of which must identify desired outcomes, assess whether the desired targets were met, and develop action plans to improve future performance.
Educational Programs
Each educational major has developed an assessment plan specifying what students must know and be able to do in order to graduate. The pilot versions of the assessment plans are available online. [1] Assessment plans and results for the 2005-2006 and 2006-2007 assessment cycles are available on the WEAVEonline site, the assessment management system used by the University. [2] Academic programs are encouraged to use direct assessments of learning and to incorporate effective assessment practices, especially course-embedded assessments.
For the 2005-2006 and 2006-2007 academic years, academic programs reported:
- their program’s desired outcomes (especially student learning outcomes),
- how the outcomes linked to the University’s Strategic Plan and institutional priorities,
- measures for assessing the outcomes,
- targets for acceptable performance,
- findings from the measures,
- action plans for improving student learning based on the assessment results, and
- an analysis of the program’s strengths and areas requiring continued monitoring.
For the 2005-2006 assessment cycle, all 170 academic programs posted their mission statements, learning outcomes, and measures for assessing the outcomes. All programs but one posted findings from their assessments, and 93% of the programs included action plans that described steps they would take to improve. As of July 13, 2007, data for the 171 academic programs in the 2006-2007 assessment cycle showed that, in addition to all programs having posted mission statements, learning outcomes, and measures, 94% had posted assessment findings. It is anticipated that this figure will be close to 100% by September 1.
There is evidence that programs adopt more effective strategies for assessing student learning as a result of their annual efforts. In the most recent reports, 35 programs mentioned using portfolios as part of their measures of student learning (more than 150 individual references), up from the 32 programs that reported using portfolios in 2005-2006.
All assessment reports may be examined at the WEAVEonline web site [2], and an example of an assessment report is provided to illustrate the format (Geology BS) [3]. Please note that reports for the bachelor’s degree program in the Robinson College of Business consist of two parts: the generic college core outcomes and the program-specific outcomes.
Examples of improvements
Programs have taken a number of actions as a result of the assessment process, departmental discussions, and feedback provided by the Office of the Associate Provost for Institutional Effectiveness. The following action plans were implemented:
- the Criminal Justice Department reevaluated its capstone examination to ensure that its content reflects the substantive information taught in the criminal justice curriculum;
- the PhD program in the Communication Department reports an increase in students presenting their work at national conferences as a result of implementing a related action plan;
- the Department of Economics added a capstone course for its undergraduate majors;
- the Geology Department created a Geoscience Learning Community to promote critical thinking in its students;
- the Finance Department reported successful implementation of its MS degree action plan to customize programs of study for students who had demonstrated prior exposure to materials in the management science areas;
- the Department of Political Science successfully implemented a program of supplemental instruction in its large-section undergraduate courses;
- the Psychology Department considered ways to incorporate non-course learning experiences, such as internships, directed study projects, Model UN/Model Arab League, Mock Trial, and study abroad programs, into the overall undergraduate program assessment process;
- the Department of Public Administration and Urban Studies, based on its assessment reports, is reconfiguring its undergraduate major in Urban Policy Studies into a public policy major with new specializations.
Administrative Programs
A parallel process has been developed to set standards, assess progress, and develop action plans for administrative and student support units. Assessment plans and results for the 2005-2006 and 2006-2007 assessment cycles are available on the WEAVEonline site, the assessment management system used by the University. [2] While the academic process for assessing student learning outcomes served as a model, the assessment paradigm developed for the administrative and support units accommodates the disparate responsibilities of those units. Although most of these units do not have student learning outcomes, the underlying tenet (that an outcome reflects what the student learns, not what was taught) was mirrored in defining an administrative outcome as the impact the unit has on its customers, constituents, or the institution (as distinct from a strategic goal or objective).
The 90 administrative and student support units spread across six divisions of the institution developed initial assessment plans in the spring of 2004. These plans included identifying expected outcomes, measures for those outcomes, targets for success on each measure, and plans for collecting and utilizing performance data. The units received feedback from the Associate Provost for Institutional Effectiveness before beginning their data collection at the start of the fiscal year (July 1, 2004).
The first round of reporting (for fiscal year 2005) took place between July 1 and September 1, 2005. At this point, the units described and interpreted their results and indicated how the information had been or would be used to improve performance. Again, all units received feedback on their reporting efforts. For the second round of reporting (for fiscal year 2006), the University converted to the online system (WEAVEonline) described above. The reports were completed between July 1 and September 2006.
Nearly 250 different individuals have logged on to the assessment reporting website to enter or review information for their units, an indication of broad-based participation of faculty and staff in the assessment process. Many other individuals collected and reviewed their units’ assessment information before it was posted.
Examples of improvements
A number of improvements in administrative programs can be cited as a result of setting outcomes, assessing performance, and developing action plans. The following are examples:
- Responsibility for maintaining and troubleshooting computer equipment in the classrooms was transferred from the educational programming department of Information and Systems Technology to its customer service department, simplifying support for users. Faculty now call the same number for help with both their office-based equipment and classroom equipment.
- In University Career Services, the Peer Advisor program was so successful (as demonstrated by its IE measures) that it was expanded into an internship program.
- Results from Disability Services’ IE measures triggered the development of a Student Success Plan form that is used at intake for every client. The plan explicitly addresses the student’s limitations, accommodations needed, and potential resources that could be tapped.
- The Office of International Affairs instituted a formal process for soliciting, reviewing, and awarding internal grants. The internal granting program has been very successful in attracting external funding, about $1 million in FY06, the year the process was introduced.
- The staff in Auxiliary and Support Services worked with their vending machine vendor to improve the vendor’s ability to identify problems with the machines’ card readers (which read students’ University debit cards). The department also purchased spare parts to keep on site so that the vendor can replace parts or repair equipment on the spot, decreasing down-time on the vending machines.
Supporting Documentation:
- 2004-2005 learning assessment plans and reports
- WEAVEonline assessment site
- Example of a learning outcomes assessment report (Geology BS, 2006)