Webinar Outlines

Webinar 1: Part I (today)
• Introduction and brief overview of the strengths approach
• A model and framework for assessment in strengths-based TR/RT practice
• The ecological approach to strengths-based assessment
• Tools for assessment of internal and external strengths: Leisure Domain
• Tools for assessment of global outcomes of TR/RT services: Well-Being
• Questions, discussion

Webinar 2: Part II (October 8)
• Brief overview of the strengths approach and a framework for assessment from Part I
• Tools for assessment of internal and external strengths: Psychological/Emotional Domain
• Tools for assessment of internal and external strengths: Cognitive Domain
• Tools for assessment of internal and external strengths: Social Domain
• Tools for assessment of internal and external strengths: Physical Domain
• Tools for assessment of internal and external strengths: Spiritual Domain
• Questions, discussion




Webinar Outlines

Webinar 1: Part I (October 1)
• Introduction and brief overview of the strengths approach
• A model and framework for assessment in strengths-based TR/RT practice
• The ecological approach to strengths-based assessment
• Tools for assessment of internal and external strengths: Leisure Domain
• Tools for assessment of global outcomes of TR/RT services: Well-Being
• Questions, discussion

Webinar 2: Part II (today)
• Brief overview of the strengths approach and a framework for assessment from Part I
• Tools for assessment of internal and external strengths: Psychological/Emotional Domain
• Tools for assessment of internal and external strengths: Cognitive Domain
• Tools for assessment of internal and external strengths: Social Domain
• Tools for assessment of internal and external strengths: Physical Domain
• Tools for assessment of internal and external strengths: Spiritual Domain
• Questions, discussion




Revision of Gen Ed/Core Outcomes
Timeline years: Year 2 (2017-18), Year 3 (2018-19), Year 4 (2019-20), Year 5 (2020-21), Year 6 (2021-22)
• Gen Ed Framework: Endorsement of Framework → Develop Gen Ed criteria (Ad Hoc Committee) → Approval of Framework and Policy
• Core Outcomes: CL Pilot Rubric, Discipline Areas Develop Rubrics → CL refine rubric/outcome, DA Pilot Rubrics → DAs refine rubrics/outcomes, Approve Core Outcomes → CTE realign Degree/Certs with new Core
• Gen Ed Courses: SACs propose Gen Ed courses via new criteria → SACs propose Gen Ed courses → Revised Gen Ed list in catalog
• College Wide Assessment: Faculty volunteer student work for Pilot CL Rubric → Faculty volunteer student work for Pilot DA Rubrics → Faculty volunteer student work for new rubrics, TBD → Student sample from revised Gen Ed courses → Student sample all Gen Ed courses
• SAC Annual Assessment: Core Outcome → Core Outcome → Core Outcome → Any outcome → Any outcome




What are Assessment Rubrics? Small Group Discussion

In small groups, look at the sample rubric packet. Then answer the following discussion questions:
• Which rubrics seem the MOST clear to you? Why?
• Which rubrics are not clear? Why?
• What was common among the rubrics you viewed as clear?
• Based on the samples, how do you define assessment rubrics?

Dr. Mary Blackinton, 01-19-08




Evaluation Criteria
• 15 rubrics specific to subject area
• 5 rubrics related to planning
• 5 rubrics related to instruction
• 5 rubrics related to assessment
• Five-level rubrics address a wide range of performance, representing the knowledge and skills of a novice ranging from not ready to teach (Level 1) to the advanced practices of a highly accomplished beginner (Level 5).




Going beyond the CAPM

• The data are not entirely consistent with the CAPM.
• Researchers constructed portfolios of stocks with beta = 1, ordered them by the size of the stocks they contained, and checked whether all such portfolios had average returns as predicted by the CAPM. They found that they did not: portfolios of small stocks tended, on average, to earn higher returns than portfolios of larger stocks.
• Furthermore, portfolios consisting of stocks with high book-to-market ratios (value stocks) had higher average returns than portfolios consisting of stocks with low book-to-market ratios (growth stocks).
• Momentum strategies seemed to provide positive alphas (abnormal returns) when adjusted only for CAPM beta risk.
• These CAPM violations could arise because we do not have the true market portfolio or because the CAPM itself does not fully describe expected returns. However, since these findings are stable over time, it is likely that these factors represent sources of risk in addition to the beta measured using popular market indices.
• Using these empirical facts, researchers have constructed a four-factor asset-pricing model; a sketch of such a factor regression follows below.
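A minimal sketch of the kind of four-factor adjustment described above, assuming Carhart-style factors (market, size/SMB, value/HML, momentum) and synthetic placeholder data rather than real returns: regress a portfolio's excess returns on the four factor series and read the intercept as the abnormal return (alpha).

```python
import numpy as np

# Synthetic placeholder data standing in for monthly factor and portfolio returns.
rng = np.random.default_rng(0)
T = 120                                                  # months of made-up data
mkt, smb, hml, mom = rng.normal(0.0, 0.04, size=(4, T))  # hypothetical factor returns
port_excess = (0.002 + 1.0 * mkt + 0.4 * smb + 0.3 * hml + 0.1 * mom
               + rng.normal(0.0, 0.01, size=T))

# Regress the portfolio's excess returns on the four factors; the intercept is
# the alpha that remains after adjusting for all four sources of risk.
X = np.column_stack([np.ones(T), mkt, smb, hml, mom])
coefs, *_ = np.linalg.lstsq(X, port_excess, rcond=None)
alpha, b_mkt, b_smb, b_hml, b_mom = coefs
print(f"alpha = {alpha:.4f}; betas: mkt {b_mkt:.2f}, smb {b_smb:.2f}, "
      f"hml {b_hml:.2f}, mom {b_mom:.2f}")
```

With real data, the factor series would come from a source such as the Ken French data library rather than being simulated.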




Project Component Evaluation and Results
• Spring 2009: effective communication rubrics were used in the course.
• The general education committee of the university designed the rubrics with the active participation of the English faculty member who joined our course teaching.
• The rubrics were introduced and explained to students during the initial presentation by the English faculty member early in the course.
• We evaluated each oral presentation using these rubrics and provided each student with a detailed explanation of their evaluation.
• The rubrics are presented in the Appendix of the paper.




Principles for Using Rubrics Responsibly
• Use a rubric that matches your instructional goals.
• Acknowledge the limitations of rubrics.
• Remember that rubrics don't simply measure quality; rather, they define quality.
• Distribute rubrics to students at the BEGINNING of the assignment.
• Use a variety of rubrics. (One size DOES NOT fit all!)




Academic Year Timeline of Marquette Assessment

2004
• HLC site visit finds significant shortcomings with assessment.
• Vice Provost brought to campus to implement a campus-wide assessment system.

2005
• Assessment steering team created to develop a framework for interrelated assessment.
• Program Assessment Leader (PAL) appointed for each program.

2006
• Knowledge Area Learning Outcomes introduced.

2007
• Integrated Core Learning Outcomes (ICLOs) introduced.
• Annual assessment reporting begins.
• First Peer Review Seminar.

2008
• University Assessment Committee (UAC) directly involved in reviewing and monitoring each step of assessment implementation.

2009
• HLC Focused Visit to follow up on assessment concerns.
• Core Curriculum Assessment using multiple-choice quiz on D2L.
• Graduating Senior Survey data used to provide indirect measures of student proficiency with ICLOs.

2011
• Assessment Director hired.

2012
• Knowledge area learning outcome assessment cycle implemented (first full cycle to be completed in AY2014-15).
• Implementation of ARMS.
• Senior experience/capstone assessment grant pilot.

2013
• HLC site visit.

2014
• Accreditation reaffirmed. Suggestions for curriculum mapping and benchmarking.

2018
• Report to HLC.




Types of Electronic Portfolios
• Showcase or Best Works Portfolios (Marketing)
• Assessment Portfolios (Summative)
• Learning Portfolios (Formative)

From Barrett, H. C. (2001). Electronic Portfolios. Educational Technology: An Encyclopedia.




Computer Engineering – Rubrics
• ABET Criterion 3 – Student Learning Outcomes
  – criteria/traits ⇔ rubrics (typical performance at 3 levels) ⇔ assessment instruments ⇔ criteria/traits
• Development Process
  – EECE Undergraduate Committee – research, draft, revise
  – EECE Faculty – review, advise, approve
• Implementation
  – Posted
  – Used with modifications by faculty in classes
  – Used for assessment scoring, analysis, reporting
  – Reviewed during assessment – with respect to instrument/criteria/traits
• Instrument/criteria/rubric review
  – F – Ethical and professional standards
  – I – Life-Long Learning
  – Elevate the Standard
  – Diffuse through the curriculum via rubrics
  – UCCS review
• ABET
  – Managing the assessment process
  – Consistency in interpretation, application
  – Revision of SLOs from A-K to 1-6
• Suggestions for reading
  – James O. Nichols, The Departmental Guide to Implementation of Student Outcomes Assessment and Institutional Effectiveness, Agathon Press, NY, 1991.
  – James O. Nichols & Karen W. Nichols, The Departmental Guide and Record Book for Student Outcomes Assessment and Institutional Effectiveness, 3rd ed., Agathon Press, NY, 2001.
  – Gloria Rogers, "Developing Rubrics", ABET Webinar, March 2010. Slides available from http://apa.fiu.edu/documents_rubrics/CEC%20Rubrics/Developing%20Rubrics%20in%20Engineering%20-ABET.pdf
  – Google "Rubrics for <…> Majors"; "Rubrics for …"




Objectives for Day II
– Discuss types of rubrics and their purpose
– Examine models of rubrics
– Consider advantages and disadvantages of homegrown rubrics and outside rubrics
– Discuss Primary Traits Analysis
– Determine the rubric for the Individual Assessment Report
– Finalize plan for Summer Individual Assessment, including rubric




Is the CAPM true?

One way to check this is to look at whether the expected returns on assets are linearly related to their betas, i.e., does the CAPM hold? Furthermore, if the CAPM holds for single assets, the relationship must hold for portfolios of assets as well. Researchers (e.g., Banz) constructed portfolios of stocks, ordered them by the size of the stocks they contained, and checked whether all such portfolios lay on the Security Market Line. They found that they did not: portfolios of small stocks tended, on average, to earn higher returns than portfolios of larger stocks.
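For reference, the linear relation being tested (standard CAPM notation, not reproduced from the slide) is the Security Market Line:

\[
E[R_i] = r_f + \beta_i \,\bigl(E[R_m] - r_f\bigr),
\qquad
\beta_i = \frac{\operatorname{Cov}(R_i, R_m)}{\operatorname{Var}(R_m)}
\]

so a plot of average returns against estimated betas should be a straight line with intercept r_f and slope equal to the market risk premium; the size-sorted portfolios described above fall systematically off that line.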




Efficiency of the Market Model

If the market portfolio is efficient, then we get a similar equation with the market portfolio replacing portfolio P; the expected return equation in this case is exactly the CAPM. Even though it is not unreasonable to assume that the market portfolio is efficient, since market risk is pervasive and unavoidable, this is not logically necessary. Hence we have to check whether the market portfolio is, in fact, efficient. One way to do so is to look at whether the expected returns on assets are linearly related to their betas, i.e., does the CAPM hold? Furthermore, if the CAPM holds for single assets, the relationship must hold for portfolios of assets as well. Researchers (e.g., Banz) constructed portfolios of stocks, ordered them by the size of the stocks they contained, and checked whether all such portfolios lay on the Security Market Line. They found that they did not: portfolios of small stocks tended, on average, to earn higher returns than portfolios of larger stocks.
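A minimal sketch of the comparison being described, with hypothetical betas and average returns standing in for the size-sorted portfolio estimates (the numbers are illustrative, not Banz's results):

```python
rf = 0.03                    # assumed risk-free rate
mkt_premium = 0.06           # assumed market risk premium, E[Rm] - rf

# name: (estimated beta, average realized return) -- all values hypothetical
portfolios = {
    "small stocks": (1.10, 0.140),
    "mid stocks":   (1.00, 0.100),
    "large stocks": (0.95, 0.085),
}

for name, (beta, avg_ret) in portfolios.items():
    sml_return = rf + beta * mkt_premium   # return the Security Market Line predicts
    deviation = avg_ret - sml_return       # gap between realized and predicted return
    print(f"{name:>12}: SML {sml_return:.3f}  actual {avg_ret:.3f}  "
          f"deviation {deviation:+.3f}")

# A persistent positive deviation for the small-stock portfolio is the size
# effect described above: small-stock portfolios plot above the SML.
```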




WHAT PROGRESS HAVE WE MADE IN ASSESSING OUR PROGRAMS?

2009
• SDSU SLO Committee Feedback: "Pleased with substantial improvement in both BSBA and MSBA, but…
  – develop a pool of questions that will be randomly selected for assessment of program goals within courses
  – develop improved rubrics so both instructors and students will understand the scale"
• IDS Department Loop Closing:
  – Improved descriptors on rubrics to more clearly enunciate levels of achievement
  – Unable to develop the pool of questions in time for the 2010 assessment cycle; this item is highest on our priorities to address related to our assessment activities

2010
• SDSU SLO Committee Feedback: "Pleased with the improved rubrics, but get to work on the pool of questions!"
• IDS Department Loop Closing: This will be addressed at the next faculty curriculum/assessment meeting.




Highly referenced text
• Stevens, D. D., & Levi, A. J. (2005). Introduction to rubrics: An assessment tool to save grading time, convey effective feedback and promote student learning. Sterling, VA: Stylus Publishing.
• Companion PowerPoint slides available at http://www.introductiontorubrics.com/index.html

Sites to help make rubrics:
• RubiStar – free tool to help teachers create quality rubrics. http://rubistar.4teachers.org/
• RCampus (personal) – over 30,000 shared rubrics; build one electronically. https://www.rcampus.com/
• teAchnology – rubric makers. http://www.teach-nology.com/web_tools/rubrics/




FUNDAMENTAL #3: METRICS
• Assessment Metrics
  – A scoring mechanism (that isn't simple grading or SEIs)
  – Scaled rubrics, checklist rubrics, grade-to-SLO conversions, holistic scoring guides, custom rubrics or scoring guides
  – The GEF program will be using the sixteen VALUE rubrics that were developed in tandem with the LEAP outcomes
  – LEAP outcome #1 doesn't seem to have a rubric…
• Task 3: Identify which VALUE rubric you will use to score the assignment (a small scoring sketch follows below)
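As a concrete illustration of the scoring mechanisms listed above, here is a minimal sketch (hypothetical scores and an assumed benchmark level, not the GEF program's actual procedure) that turns scaled-rubric scores into an attainment rate for a single outcome:

```python
# Hypothetical scores on a 4-level VALUE-style scaled rubric, one per student
# artifact; the benchmark level is an assumption for illustration only.
scores = [4, 3, 2, 4, 3, 1, 3, 4, 2, 3]
benchmark = 3                                  # assumed "meets the outcome" level

attainment = sum(score >= benchmark for score in scores) / len(scores)
print(f"{attainment:.0%} of sampled artifacts scored at or above level {benchmark}")
```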




Finding the Gen Ed Assessment/Scoring Rubrics
• The rubrics reside within eLearn in the General Education Rubric Repository.
• Every faculty member has access to the Repository and the rubrics.
• The rubrics can be easily uploaded directly into your eLearn course shells.




13 Years after the Internet: Where Goes Information Literacy?
IL Assessment: Instruments, Portfolios & Focus Group
Richard Sweeney & Haymwantee Singh
[email protected] | 973-596-3208 / 8498

Who is IL assessment for?
1. Assessment should inform the learner about how he/she is skilled compared to other learners.
2. Assessment should inform the institution how their students are collectively performing.
3. Assessment should comparatively inform outside agencies and the public how that institution is performing.

Carr, Nicholas. "Is Google Making Us Stupid? What the Internet Is Doing to Our Brains." Atlantic Monthly 301:6, July/August 2008.

Internet Language




ASSESSMENT METHOD
Rubric levels: Reflects best practices | Meets standards | Needs development | Reviewer comments or suggestions

Reflects best practices — the assessment method describes, in detail:
☒ what the data source is (scores from exams, surveys, presentations, etc.)
☐ how the data will be gathered and by whom
☐ how often/when the data will be gathered
☒ who will evaluate/score it
☐ what the evaluation scale is (%? SD – SA? 0-5? P/F?)
☒ the criteria for acceptable performance (e.g., 85% pass rate, 75% score, 80% agree or strongly agree)
☒ who will review the results and when they will be reviewed
☐ The assessment isolates useful data* about the target learning outcome from other information.
☐ The assessment method is practical (i.e., it can be implemented with existing time and resources).

Meets standards:
☐ All information is provided.
☐ The method includes sufficient detail to easily understand whether the assessment is appropriate for measuring the target learning outcome(s).
☐ The assessment isolates useful data* about the target learning outcome from other information.
☐ The assessment is practical.

Needs development:
☐ All information is provided, but some details need clarification.
☐ The assessment isolates useful data about the target learning outcome from other information.
☐ The assessment is practical.
☒ Not all information is provided, or
☒ Many details need clarification, or
☐ The assessment does not provide useful data about the target learning outcome (e.g., retention rates (as data) don't reveal whether students write well, where writing well is the target learning outcome), or
☐ The assessment does not isolate data about the target learning outcome from other information (in most cases, course grades as a data source fall under this category), or
☐ The assessment is not practical.

*Useful data means that your scores, responses, results, etc. are at an appropriate level of detail to provide information about just one learning outcome and provide an indication about what the program should retain or change.

Reviewer comments or suggestions: It's unclear whether the data will be useful or whether it's practical to gather. One category of the rubric is used for two outcomes, so it doesn't completely isolate data.